Citation impact

Citation impact quantifies how often scholarly works are cited by other works.[1][2][3][4][5] It is a result of citation analysis or bibliometrics. Among the measures that have emerged from citation analysis are citation counts for an individual article, an author, and an academic journal.

Article-level

One of the most basic citation metrics is how often an article is cited in other articles, books, or other sources (such as theses). Citation rates depend heavily on the discipline and the number of people working in that area. For instance, many more scientists work in neuroscience than in mathematics, and neuroscientists publish more papers than mathematicians; hence neuroscience papers are cited much more often than papers in mathematics.[6][7] Similarly, review papers are cited more often than regular research papers because they summarize results from many papers. This may also be why papers with shorter titles get more citations, since they usually cover a broader area.[8]

Most-cited papers

The most-cited paper of all time is the paper by Oliver Lowry describing an assay to measure the concentration of proteins.[9] By 2014 it had accumulated more than 305,000 citations. The 10 most-cited papers all had more than 40,000 citations.[10] Reaching the top 100 required 12,119 citations by 2014.[10] Of the more than 58 million items in Thomson Reuters' Web of Science database, only 14,499 papers (~0.026%) had more than 1,000 citations in 2014.[10]

Journal-level

Figure: Journal impact factors are influenced heavily by a small number of highly cited papers. In general, most papers published in 2013–14 received many fewer citations than indicated by the impact factor. Two journals (Nature [blue], PLOS One [orange]) are shown to represent a highly cited and less cited journal, respectively. Note that the high citation impact of Nature is derived from relatively few highly cited papers. Modified after Callaway 2016.[11]

Journal impact factors (JIFs) measure the average number of citations that articles published by a journal in the previous two years have received in the current year. However, very high impact factors are often driven by a small number of very highly cited papers. For instance, most papers in Nature (impact factor 38.1, 2016) were "only" cited 10 or 20 times during the reference year (see figure). Journals with a "low" impact factor (e.g. PLOS One, impact factor 3.1) publish many papers that are cited 0 to 5 times but few highly cited articles.[11]
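As a concrete sketch of the arithmetic behind a two-year JIF: citations received in year Y by items the journal published in years Y-1 and Y-2 are divided by the number of citable items published in those two years. The function name and the citation counts below are purely illustrative.

```python
from typing import Iterable

def journal_impact_factor(citations_in_year: Iterable[int], citable_items: int) -> float:
    """Two-year impact factor for year Y: citations received in year Y by items
    published in years Y-1 and Y-2, divided by the number of citable items
    published in Y-1 and Y-2."""
    return sum(citations_in_year) / citable_items

# Hypothetical per-paper citation counts for a small journal's Y-1/Y-2 output,
# counted during year Y; note how one highly cited paper dominates the average.
cites = [0, 0, 1, 2, 3, 5, 9, 120]
print(journal_impact_factor(cites, len(cites)))  # 17.5
```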

JIFs are often misinterpreted as a measure of journal quality or even article quality. The JIF is a journal-level metric, not an article-level metric, so using it to determine the impact of a single article is statistically invalid. The citation distribution of a journal is skewed because a very small number of articles drives the vast majority of citations (see figure). For this reason, some journals have stopped publicizing their impact factor, e.g. the journals of the American Society for Microbiology.[12]
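A short illustration of why the skew matters, using hypothetical per-article citation counts: the mean (which a JIF-style average reflects) sits far above what a typical article in the journal actually receives.

```python
import statistics

# Hypothetical, highly skewed per-article citation counts in a JIF window:
# a couple of blockbuster papers drive the mean far above the median.
cites = [0, 0, 0, 1, 1, 2, 2, 3, 5, 9, 40, 350]

print(statistics.mean(cites))    # ~34.4 -- what a JIF-style average reports
print(statistics.median(cites))  # 2.0  -- what a typical article receives
```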

Author-level

Total citations, or average citation count per article, can be reported for an individual author or researcher. Many other measures have been proposed, beyond simple citation counts, to better quantify an individual scholar's citation impact.[13] The best-known measures include the h-index[14] and the g-index.[15] Each measure has advantages and disadvantages,[16] ranging from bias and discipline dependence to limitations of the citation data source.[17] Counting the number of citations per paper is also employed to identify the authors of citation classics.[18]
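The following sketch shows how the two best-known indices can be computed from a list of per-paper citation counts; the definitions follow Hirsch[14] and Egghe,[15] and the function names and sample data are hypothetical.

```python
def h_index(citations):
    """Largest h such that h papers have at least h citations each (Hirsch, 2005)."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(counts, start=1):
        if c >= rank:
            h = rank
        else:
            break
    return h

def g_index(citations):
    """Largest g such that the top g papers together have at least g**2 citations (Egghe, 2006)."""
    counts = sorted(citations, reverse=True)
    total, g = 0, 0
    for rank, c in enumerate(counts, start=1):
        total += c
        if total >= rank * rank:
            g = rank
    return g

papers = [25, 8, 5, 3, 3, 2, 0]   # hypothetical per-paper citation counts
print(h_index(papers))            # 3 -- the 4th-ranked paper has only 3 < 4 citations
print(g_index(papers))            # 6 -- the top 6 papers have 46 >= 36 citations in total
```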

Alternatives

An alternative approach to measuring a scholar's impact relies on usage data, such as the number of downloads from publishers, and on analyzing citation performance, often at the article level.[19][20][21][22]

As early as 2004, the BMJ published the number of views for its articles, which was found to be somewhat correlated with citations.[23] In 2008 the Journal of Medical Internet Research began publishing views and tweets. These "tweetations" proved to be a good indicator of highly cited articles, leading the author to propose a "Twimpact factor", the number of tweets an article receives in the first seven days after publication, as well as a "Twindex", the rank percentile of an article's Twimpact factor.[24]
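A minimal sketch of these two Twitter-based metrics is shown below. It assumes per-article tweet timestamps are available; the function names are illustrative, and the exact percentile convention in the original paper may differ in detail.

```python
from datetime import datetime, timedelta
from typing import List

def twimpact_factor(published: datetime, tweets: List[datetime], days: int = 7) -> int:
    """Number of tweets mentioning an article within `days` of publication (tw7)."""
    cutoff = published + timedelta(days=days)
    return sum(published <= t < cutoff for t in tweets)

def twindex(article_tw: int, cohort_tw: List[int]) -> float:
    """Rank percentile of an article's Twimpact factor within a comparison cohort."""
    return 100.0 * sum(tw <= article_tw for tw in cohort_tw) / len(cohort_tw)

# Hypothetical usage with made-up timestamps and a made-up comparison cohort.
pub = datetime(2011, 12, 1)
tweets = [pub + timedelta(days=d) for d in (0, 1, 1, 3, 6, 10, 30)]
tw7 = twimpact_factor(pub, tweets)             # 5 tweets within the first week
print(tw7, twindex(tw7, [0, 1, 2, 5, 8, 20]))  # 5, ~66.7
```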

In response to growing concerns over the inappropriate use of journal impact factors in evaluating scientific outputs and scientists themselves, Université de Montréal, Imperial College London, PLOS, eLife, EMBO Journal, The Royal Society, Nature and Science proposed citation distribution metrics as an alternative to impact factors.[25][26][27]

Open Access publications

Because open access (OA) publications are available to readers without cost, it has been argued that they should be cited more frequently.[28][29][30][31][32][33][34][35] While some experimental and observational studies have contradicted this,[36][37] recent evidence suggests that OA journals do receive significantly more citations overall than non-OA journals (median 15.5 vs. 12), which suggests that publishing in an OA journal can yield more citations.[38]

Recent developments

An important recent development in research on citation impact is the discovery of universality: citation impact patterns that hold across different disciplines in the sciences, social sciences, and humanities. For example, it has been shown that the number of citations received by a publication, once properly rescaled by the average across articles published in the same discipline and in the same year, follows a universal log-normal distribution that is the same in every discipline.[39] This finding suggests a universal citation impact measure that extends the h-index by properly rescaling citation counts and re-ranking publications; however, computing such a universal measure requires extensive citation data and statistics for every discipline and year. Social crowdsourcing tools such as Scholarometer have been proposed to address this need.[40][41] Kaur et al. proposed a statistical method to evaluate the universality of citation impact metrics, i.e., their capability to compare impact fairly across fields.[42] Their analysis identified universal impact metrics, such as the field-normalized h-index.
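A sketch of the rescaling step behind this universality result, assuming simple (field, year, citations) records: each citation count is divided by the average count of papers from the same discipline and year (the relative indicator of Radicchi et al.[39]), and an h-index can then be computed on the rescaled values. The function names are illustrative.

```python
from collections import defaultdict
from statistics import mean

def rescale_citations(papers):
    """papers: iterable of (field, year, citations) records.
    Returns the same records with citations divided by the average citation count
    of papers from the same field and year (the relative indicator c/c0 of
    Radicchi et al. 2008). Assumes each field-year group has a nonzero average."""
    groups = defaultdict(list)
    for field, year, c in papers:
        groups[(field, year)].append(c)
    avg = {key: mean(vals) for key, vals in groups.items()}
    return [(field, year, c / avg[(field, year)]) for field, year, c in papers]

def universal_h_index(rescaled_counts):
    """h-index computed on an author's rescaled (field-normalized) citation values."""
    counts = sorted(rescaled_counts, reverse=True)
    return max([rank for rank, c in enumerate(counts, start=1) if c >= rank], default=0)
```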

Research suggests that the impact of an article can be partly explained by superficial factors rather than solely by its scientific merits.[43] Field-dependent factors are usually cited as an issue to be addressed not only when comparisons are made across disciplines, but also when different fields of research within one discipline are compared.[44] For instance, in medicine the number of authors, the number of references, the article length, and the presence of a colon in the title influence impact, among other factors; in sociology, the number of references, article length, and title length are among the relevant factors.[45] It has also been suggested that scholars engage in ethically questionable behavior in order to inflate the number of citations their articles receive.[46]

Automated citation indexing[47] has changed the nature of citation analysis research, allowing millions of citations to be analyzed for large-scale patterns and knowledge discovery. The first example of automated citation indexing was CiteSeer, later followed by Google Scholar. More recently, advanced models for a dynamic analysis of citation aging have been proposed.[48][49] The latter model can even be used as a predictive tool to estimate the citations a corpus of publications may receive at any point in its lifetime.

According to Mario Biagioli: "All metrics of scientific evaluation are bound to be abused. Goodhart's law [...] states that when a feature of the economy is picked as an indicator of the economy, then it inexorably ceases to function as that indicator because people start to game it."[50]

References

  1. ^ Garfield, E. (1955). "Citation Indexes for Science: A New Dimension in Documentation through Association of Ideas". Science. 122 (3159): 108. Bibcode:1955Sci...122..108G. doi:10.1126/science.122.3159.108. PMID 14385826.
  2. ^ Garfield, E. (1973). "Citation Frequency as a Measure of Research Activity and Performance" (PDF). Essays of an Information Scientist. 1: 406–408.
  3. ^ Garfield, E. (1988). "Can Researchers Bank on Citation Analysis?" (PDF). Essays of an Information Scientist. 11: 354.
  4. ^ Garfield, E. (1998). "The use of journal impact factors and citation analysis in the evaluation of science". 41st Annual Meeting of the Council of Biology Editors.
  5. ^ Moed, Henk F. (2005). Citation Analysis in Research Evaluation. Springer. ISBN 978-1-4020-3713-9.
  6. ^ de Solla Price, D. J. (1963). Little Science, Big Science. Columbia University Press.
  7. ^ Larsen, P. O.; von Ins, M. (2010). "The rate of growth in scientific publication and the decline in coverage provided by Science Citation Index". Scientometrics. 84 (3): 575–603. doi:10.1007/s11192-010-0202-z.
  8. ^ Deng, B. (26 August 2015). "Papers with shorter titles get more citations". Nature News. doi:10.1038/nature.2015.18246.
  9. ^ Lowry, O. H.; Rosebrough, N. J.; Farr, A. L.; Randall, R. J. (1951). "Protein measurement with the Folin phenol reagent". The Journal of Biological Chemistry. 193 (1): 265–275. PMID 14907713.
  10. ^ a b c van Noorden, R.; Maher, B.; Nuzzo, R. (2014). "The top 100 papers". Nature. 514 (7524): 550–553. Bibcode:2014Natur.514..550V. doi:10.1038/514550a. PMID 25355343.
  11. ^ a b Callaway, E. (2016). "Beat it, impact factor! Publishing elite turns against controversial metric". Nature. 535 (7611): 210–211. Bibcode:2016Natur.535..210C. doi:10.1038/nature.2016.20224. PMID 27411614.
  12. ^ Casadevall, A.; Bertuzzi, S.; Buchmeier, M. J.; Davis, R. J.; Drake, H.; Fang, F. C.; Gilbert, J.; Goldman, B. M.; Imperiale, M. J. (2016). "ASM Journals Eliminate Impact Factor Information from Journal Websites". mSphere. 1 (4): e00184–16. doi:10.1128/mSphere.00184-16. PMC 4941020. PMID 27408939.
  13. ^ Belikov, A. V.; Belikov, V. V. (2015). "A citation-based, author- and age-normalized, logarithmic index for evaluation of individual researchers independently of publication counts". F1000Research. 4: 884. doi:10.12688/f1000research.7070.1.
  14. ^ Hirsch, J. E. (2005). "An index to quantify an individual's scientific research output". PNAS. 102 (46): 16569–16572. arXiv:physics/0508025. Bibcode:2005PNAS..10216569H. doi:10.1073/pnas.0507655102. PMC 1283832. PMID 16275915.
  15. ^ Egghe, L. (2006). "Theory and practise of the g-index". Scientometrics. 69 (1): 131–152. doi:10.1007/s11192-006-0144-7.
  16. ^ Gálvez RH (March 2017). "Assessing author self-citation as a mechanism of relevant knowledge diffusion". Scientometrics. doi:10.1007/s11192-017-2330-1.
  17. ^ Couto, F. M.; Pesquita, C.; Grego, T.; Veríssimo, P. (2009). "Handling self-citations using Google Scholar". Cybermetrics. 13 (1): 2.
  18. ^ Serenko, A.; Dumay, J. (2015). "Citation classics published in knowledge management journals. Part I: Articles and their characteristics" (PDF). Journal of Knowledge Management. 19 (2): 401–431. doi:10.1108/JKM-06-2014-0220.
  19. ^ Bollen, J.; Van de Sompel, H.; Smith, J.; Luce, R. (2005). "Toward alternative metrics of journal impact: A comparison of download and citation data". Information Processing and Management. 41 (6): 1419–1440. arXiv:cs.DL/0503007. doi:10.1016/j.ipm.2005.03.024.
  20. ^ Brody, T.; Harnad, S.; Carr, L. (2005). "Earlier Web Usage Statistics as Predictors of Later Citation Impact". Journal of the Association for Information Science and Technology. 57 (8): 1060. arXiv:cs/0503020. Bibcode:2005cs........3020B. doi:10.1002/asi.20373.
  21. ^ Kurtz, M. J.; Eichhorn, G.; Accomazzi, A.; Grant, C.; Demleitner, M.; Murray, S. S. (2004). "The Effect of Use and Access on Citations". Information Processing and Management. 41 (6): 1395–1402. arXiv:cs/0503029. Bibcode:2005IPM....41.1395K. doi:10.1016/j.ipm.2005.03.010.
  22. ^ Moed, H. F. (2005b). "Statistical Relationships Between Downloads and Citations at the Level of Individual Documents Within a Single Journal". Journal of the American Society for Information Science and Technology. 56 (10): 1088–1097. doi:10.1002/asi.20200.
  23. ^ Perneger, T. V. (2004). "Relation between online "hit counts" and subsequent citations: Prospective study of research papers in the BMJ". BMJ. 329 (7465): 546–7. doi:10.1136/bmj.329.7465.546. PMC 516105. PMID 15345629.
  24. ^ Eysenbach, G. (2011). "Can Tweets Predict Citations? Metrics of Social Impact Based on Twitter and Correlation with Traditional Metrics of Scientific Impact". Journal of Medical Internet Research. 13 (4): e123. doi:10.2196/jmir.2012. PMC 3278109. PMID 22173204.
  25. ^ Veronique Kiermer (2016). "Measuring Up: Impact Factors Do Not Reflect Article Citation Rates". The Official PLOS Blog.
  26. ^ "Ditching Impact Factors for Deeper Data". The Scientist. Retrieved 2016-07-29.
  27. ^ "Scientific publishing observers and practitioners blast the JIF and call for improved metrics". Physics Today. Retrieved 2016-03-08.
  28. ^ Bibliography of Findings on the Open Access Impact Advantage
  29. ^ Brody, T.; Harnad, S. (2004). "Comparing the Impact of Open Access (OA) vs. Non-OA Articles in the Same Journals". D-Lib Magazine. 10: 6.
  30. ^ Eysenbach, G.; Tenopir, C. (2006). "Citation Advantage of Open Access Articles". PLoS Biology. 4 (5): e157. doi:10.1371/journal.pbio.0040157. PMC 1459247. PMID 16683865.
  31. ^ Eysenbach, G. (2006). "The Open Access Advantage". Journal of Medical Internet Research. 8 (2): e8. doi:10.2196/jmir.8.2.e8.
  32. ^ Hajjem, C.; Harnad, S.; Gingras, Y. (2005). "Ten-Year Cross-Disciplinary Comparison of the Growth of Open Access and How It Increases Research Citation Impact" (PDF). IEEE Data Engineering Bulletin. 28 (4): 39–47. arXiv:cs/0606079. Bibcode:2006cs........6079H.
  33. ^ Lawrence, S. (2001). "Free online availability substantially increases a paper's impact". Nature. 411 (6837): 521–521. doi:10.1038/35079151. PMID 11385534.
  34. ^ MacCallum, C. J.; Parthasarathy, H. (2006). "Open Access Increases Citation Rate". PLoS Biology. 4 (5): e176. doi:10.1371/journal.pbio.0040176.
  35. ^ Gargouri, Y.; Hajjem, C.; Lariviere, V.; Gingras, Y.; Brody, T.; Carr, L.; Harnad, S. (2010). "Self-Selected or Mandated, Open Access Increases Citation Impact for Higher Quality Research". PLOS One. 5 (10): e13636. arXiv:1001.0361. Bibcode:2010PLoSO...513636G. doi:10.1371/journal.pone.0013636. PMC 2956678. PMID 20976155.
  36. ^ Davis, P. M.; Lewenstein, B. V.; Simon, D. H.; Booth, J. G.; Connolly, M. J. L. (2008). "Open access publishing, article downloads, and citations: randomised controlled trial". BMJ. 337: a568–a568. doi:10.1136/bmj.a568. PMC 2492576. PMID 18669565.
  37. ^ Davis, P. M. (2011). "Open access, readership, citations: a randomized controlled trial of scientific journal publishing". The FASEB Journal. 25 (7): 2129–2134. doi:10.1096/fj.11-183988. PMID 21450907.
  38. ^ Chua, SK; Qureshi, Ahmad M; Krishnan, Vijay; Pai, Dinker R; Kamal, Laila B; Gunasegaran, Sharmilla; Afzal, MZ; Ambawatta, Lahiru; Gan, JY (2017-03-02). "The impact factor of an open access journal does not contribute to an article's citations". F1000Research. 6: 208. doi:10.12688/f1000research.10892.1.
  39. ^ Radicchi, F.; Fortunato, S.; Castellano, C. (2008). "Universality of citation distributions: Toward an objective measure of scientific impact". PNAS. 105 (45): 17268–17272. arXiv:0806.0974. Bibcode:2008PNAS..10517268R. doi:10.1073/pnas.0806977105. PMC 2582263. PMID 18978030.
  40. ^ Hoang, D.; Kaur, J.; Menczer, F. (2010). "Crowdsourcing Scholarly Data" (PDF). Proceedings of the WebSci10: Extending the Frontiers of Society On-Line.
  41. ^ Kaur, J.; Hoang, D.; Sun, X.; Possamai, L.; JafariAsbagh, M.; Patil, S.; Menczer, F. (2012). "Scholarometer: A Social Framework for Analyzing Impact across Disciplines". PLOS One. 7 (9): e43235. Bibcode:2012PLoSO...743235K. doi:10.1371/journal.pone.0043235.
  42. ^ Kaur, J.; Radicchi, F.; Menczer, F. (2013). "Universality of scholarly impact metrics". Journal of Informetrics. 7 (4): 924–932. doi:10.1016/j.joi.2013.09.002.
  43. ^ Bornmann, L.; Daniel, H. D. (2008). "What do citation counts measure? A review of studies on citing behavior". Journal of Documentation. 64 (1): 45–80. doi:10.1108/00220410810844150.
  44. ^ Anauati, M. V.; Galiani, S.; Gálvez, R. H. (2014). "Quantifying the Life Cycle of Scholarly Articles Across Fields of Economic Research". SSRN 2523078.
  45. ^ van Wesel, M.; Wyatt, S.; ten Haaf, J. (2014). "What a difference a colon makes: how superficial factors influence subsequent citation". Scientometrics. 98 (3): 1601–1615. doi:10.1007/s11192-013-1154-x.
  46. ^ van Wesel, M. (2016). "Evaluation by Citation: Trends in Publication Behavior, Evaluation Criteria, and the Strive for High Impact Publications". Science and Engineering Ethics. 22 (1): 199–225. doi:10.1007/s11948-015-9638-0. PMC 4750571.
  47. ^ Giles, C. L.; Bollacker, K.; Lawrence, S. (1998). "CiteSeer: An Automatic Citation Indexing System". DL'98 Digital Libraries, 3rd ACM Conference on Digital Libraries. pp. 89–98. doi:10.1145/276675.276685.
  48. ^ Yu, G.; Li, Y.-J. (2010). "Identification of referencing and citation processes of scientific journals based on the citation distribution model". Scientometrics. 82 (2): 249–261. doi:10.1007/s11192-009-0085-z.
  49. ^ Bouabid, H. (2011). "Revisiting citation aging: A model for citation distribution and life-cycle prediction". Scientometrics. 88 (1): 199–211. doi:10.1007/s11192-011-0370-5.
  50. ^ Biagioli, M. (2016). "Watch out for cheats in citation game". Nature. 535 (7611): 201–201. Bibcode:2016Natur.535..201B. doi:10.1038/535201a. PMID 27411599.

CWTS Leiden Ranking

The CWTS Leiden Ranking is an annual global university ranking based exclusively on bibliometric indicators. The rankings are compiled by the Centre for Science and Technology Studies (Dutch: Centrum voor Wetenschap en Technologische Studies, CWTS) at Leiden University in the Netherlands. The Thomson Reuters bibliographic database Web of Science is used as the source of the publication and citation data.

The Leiden Ranking ranks universities worldwide according to the volume and citation impact of the publications produced at those institutions. The rankings take into account differences in language, discipline, and institutional size. Multiple ranking lists are released according to various bibliometric normalization and impact indicators, including the number of publications, citations per publication, and field-normalized impact per publication. In addition to citation impact, the Leiden Ranking also ranks universities by scientific collaboration, including collaboration with other institutions and collaboration with an industry partner.

The first edition of the Leiden Ranking was produced in 2007. The 2014 rankings include 750 universities worldwide, which were selected based on the number of articles and reviews published by authors affiliated with those institutions in 2009–2012 in so-called "core" journals, a set of English-language journals with international scope and a "sufficiently large" number of references in the Web of Science database.

CiteSeerX

CiteSeerx (originally called CiteSeer) is a public search engine and digital library for scientific and academic papers, primarily in the fields of computer and information science. CiteSeer holds United States Patent 6,289,342, titled "Autonomous citation indexing and literature browsing using citation context", granted on September 11, 2001. Stephen R. Lawrence, C. Lee Giles, and Kurt D. Bollacker are the inventors of this patent, which is assigned to NEC Laboratories America, Inc. The patent was filed on May 20, 1998, with a priority date of January 5, 1998. A continuation patent on the same invention, US Patent 6,738,780, was granted to the same inventors on May 18, 2004 (filed May 16, 2001) and is also assigned to NEC Labs. CiteSeer is considered a predecessor of academic search tools such as Google Scholar and Microsoft Academic Search. CiteSeer-like engines and archives usually only harvest documents from publicly available websites and do not crawl publisher websites. For this reason, authors whose documents are freely available are more likely to be represented in the index.

CiteSeer's goal is to improve the dissemination of and access to academic and scientific literature. As a non-profit service that can be freely used by anyone, it has been considered part of the open access movement that is attempting to change academic and scientific publishing to allow greater access to scientific literature. CiteSeer freely provided Open Archives Initiative metadata of all indexed documents and linked indexed documents when possible to other sources of metadata such as DBLP and the ACM Portal. To promote open data, CiteSeerx shares its data for non-commercial purposes under a Creative Commons license.

The name can be construed to have at least two explanations. As a pun, a 'sightseer' is a tourist who looks at the sights, so a 'cite seer' would be a researcher who looks at cited papers. Another is that a 'seer' is a prophet, making a 'cite seer' a prophet of citations. CiteSeer changed its name to ResearchIndex at one point and then changed it back.

Clarivate Citation Laureates

Clarivate Citation Laureates, formerly Thomson Reuters Citation Laureates, is a list of candidates considered likely to win the Nobel Prize in their respective field. The candidates are so named based on the citation impact of their published research. The list of awardees is announced annually prior to the Nobel Prize ceremonies of that year. In October 2016, Thomson Reuters Intellectual Property and Science Business was acquired by Onex and Baring Asia, and the newly independent company was named Clarivate Analytics.

Courant Institute of Mathematical Sciences

The Courant Institute of Mathematical Sciences (CIMS) is an independent division of New York University (NYU) under the Faculty of Arts & Science that serves as a center for research and advanced training in computer science and mathematics. It is considered one of the leading and most prestigious mathematics schools and mathematical sciences research centers in the world. It is named after Richard Courant, one of the founders of the Courant Institute and also a mathematics professor at New York University from 1936 to 1972.

It is ranked #1 in applied mathematical research in the US, #5 in citation impact worldwide, and #12 in citations worldwide. On the Faculty Scholarly Productivity Index, it is ranked #3 with an index of 1.84. It is also known for its extensive research in pure mathematical areas, such as partial differential equations, probability and geometry, as well as applied mathematical areas, such as computational biology, computational neuroscience, and mathematical finance. The Mathematics Department of the Institute has 18 members of the United States National Academy of Sciences (more than any other mathematics department in the U.S.) and five members of the National Academy of Engineering. Four faculty members have been awarded the National Medal of Science, one was honored with the Kyoto Prize, and nine have received career awards from the National Science Foundation. Courant Institute professors Peter Lax, S. R. Srinivasa Varadhan, Mikhail Gromov, and Louis Nirenberg won the 2005, 2007, 2009 and 2015 Abel Prizes, respectively, for their research in partial differential equations, probability and geometry. Louis Nirenberg also received the Chern Medal in 2010, and Subhash Khot won the Nevanlinna Prize in 2014.

The Director of the Courant Institute directly reports to New York University's Provost and President and works closely with deans and directors of other NYU colleges and divisions respectively. The undergraduate programs and graduate programs at the Courant Institute are run independently by the Institute, and formally associated with the NYU College of Arts and Science and NYU Graduate School of Arts and Science respectively.

Douglas Diamond

Douglas Warren Diamond (born 1953) is the Merton H. Miller Distinguished Service Professor of Finance at the University of Chicago Booth School of Business. He specializes in the study of financial intermediaries, financial crises, and liquidity. He is a former president of the American Finance Association and the Western Finance Association, and a fellow of the Econometric Society, the American Academy of Arts and Sciences, and the American Finance Association.

Diamond is best known for his work on financial crises and bank runs, particularly the influential Diamond–Dybvig model published in 1983. He was listed by Thomson Reuters as one of the "researchers likely to be in contention for Nobel honours based on the citation impact of their published research". In 2016, he was awarded the CME Group-MSRI Prize in Innovative Quantitative Applications.

Duplicate publication

Duplicate publication, multiple publication, or redundant publication refers to publishing the same intellectual material more than once, by the author or publisher. It does not refer to the unauthorized republication by someone else, which constitutes plagiarism, copyright violation, or both.

Multiple submission is not plagiarism, but it is today often viewed as academic misbehavior because it can skew meta-analyses and review articles and can distort citation indexes and citation impact by gaming the system to a degree. It was not always viewed so harshly: the practice began centuries ago and, besides the negative motive of vanity, which has always been possible, it also had a legitimate motive of reaching the readerships of various journals and books that were at real risk of not otherwise overlapping.

This was a print-only era before modern discoverability via the internet and digital search, and before systematic reviews, meta-analyses, and citation indexes existed; despite a few rudimentary journal clubs, readers who subscribed to journals in one city, region, or specialty were likely to have only sporadic contact with journals from other places or specialties. Redundant publication could thus serve a valid purpose, analogous to the way that newspapers in different cities and countries often report news items from elsewhere, ensuring that people in many places receive them even though they do not read multiple periodicals from many other places. However, as discoverability increased in the 20th century and the aforementioned concerns arose, critical views of redundant publication, beyond merely reproaching vanity, took shape.

A formalization of the policy of disallowing duplicate publication was given by Franz J. Ingelfinger, the editor of The New England Journal of Medicine, in 1969; the resulting ban on republication in the journal became known as the Ingelfinger rule. Most journals follow this policy today. The BMJ, for example, requires copies of any previous work with more than 10% overlap of a submission to be submitted before approving a work for publication. However, there is at least one form of publishing the same article in multiple journals that is still widely accepted: some medical societies that issue joint medical guidelines copublish those guidelines in both societies' official journals; for example, joint guidelines by the American Heart Association and the American College of Cardiology are usually published in both Circulation and the Journal of the American College of Cardiology. This type of dual publication is analogous to co-editions of a book.

With the advancement of the internet, there are now several tools available to aid in the detection of plagiarism and multiple publications within biomedical literature. One tool developed in 2006 by researchers in Harold Garner's laboratory at University of Texas Southwestern Medical Center at Dallas was Déjà Vu, an open-access database containing several thousand instances of duplicate publication.

Journals sometimes choose to republish seminal articles, whether from their own past volumes, from other journals, or both. Re-publication serves the goal of bringing important information to new readerships, which makes it analogous to some instances of duplicate publication on that score. However, it differs from duplicate publication in that there is no element of merely gaming the system of citation impact: republished articles are clearly labeled as such, allowing them to be recognized in citation analysis.

Emory University

Emory University is a private research university in Atlanta, Georgia. The university was founded as Emory College in 1836 in Oxford, Georgia, by the Methodist Episcopal Church and was named in honor of Methodist bishop John Emory. In 1915, Emory College moved to its present location in Druid Hills and was rechartered as Emory University. Emory maintained a presence in Oxford that eventually became Oxford College, a residential liberal arts college for the first two years of the Emory baccalaureate degree. The university is the second-oldest private institution of higher education in Georgia and among the fifty oldest private universities in the United States.

Emory University has nine academic divisions: Emory College of Arts and Sciences, Oxford College, Goizueta Business School, Laney Graduate School, School of Law, School of Medicine, Nell Hodgson Woodruff School of Nursing, Rollins School of Public Health, and the Candler School of Theology. Emory University, the Georgia Institute of Technology, and Peking University in Beijing, China jointly administer the Wallace H. Coulter Department of Biomedical Engineering. The university operates the Confucius Institute in Atlanta in partnership with Nanjing University. Emory has a growing faculty research partnership with the Korea Advanced Institute of Science and Technology (KAIST). Emory University students come from all 50 states, 6 territories of the United States, and over 100 foreign countries.

Emory Healthcare is the largest healthcare system in the state of Georgia and comprises seven major hospitals, including the internationally renowned Emory University Hospital and Emory University Hospital Midtown. The university operates the Winship Cancer Institute, Yerkes National Primate Research Center, and many disease and vaccine research centers. Emory University is the leading coordinator of the U.S. Health Department's National Ebola Training and Education Center. The university is one of four institutions involved in the NIAID's Tuberculosis Research Units Program. The International Association of National Public Health Institutes is headquartered at the university, and the Centers for Disease Control and Prevention and the American Cancer Society are national affiliate institutions located adjacent to the campus. The university is partnered with the Nobel Peace Prize-winning Carter Center.

Emory University has the 16th largest endowment among colleges and universities in the United States. It is ranked 21st nationally and 71st globally according to U.S. News & World Report's 2018 rankings. Emory University has a Carnegie Classification of Institutions of Higher Education status of R1: "highest research activity" and is cited for high scientific performance and citation impact in the CWTS Leiden Ranking. The National Science Foundation ranked the university 36th among academic institutions in the United States for research and development (R&D) expenditures. Emory University research is funded primarily by federal government agencies, namely the National Institutes of Health (NIH).

In 1995 Emory University was elected to the Association of American Universities, an association of the 62 leading research universities in the United States and Canada. Emory has many distinguished alumni and affiliates, including 2 Prime Ministers, 9 university presidents, 11 members of the United States Congress, 2 Nobel Peace Prize laureates, a Vice President of the United States, a United States Speaker of the House, and a United States Supreme Court Justice. Other notable alumni include Rhodes Scholars, 6 Pulitzer Prize winners, Emmy Award winners, MacArthur Fellows, CEOs of Fortune 500 companies, heads of state and other leaders in foreign government, academics, musicians, and an Olympic medalist. Emory has more than 149,000 alumni, with 75 alumni clubs established worldwide in 20 countries.

Global Open Access Forum

The Global Open Access List (GOAL), until January 2012 the American Scientist Open Access Forum, is the longest-standing online discussion forum on Open Access (free online access to peer-reviewed research). It was created by the American Scientist, which is published by Sigma Xi, in September 1998, before the term "Open Access" (OA) was coined, and it was originally called the "September98-Forum". Its first focus was an article published in American Scientist in which Thomas J. Walker of the University of Florida proposed that journals should furnish free online access paid for out of the fees authors pay to purchase reprints. Stevan Harnad, who had in 1994 made the Subversive Proposal that all researchers should self-archive their peer-reviewed research, was invited to moderate the forum, which was not expected to last more than a few months. It continued to grow in size and influence over the years and is still the site where most of the main developments in OA are first mooted, including self-archiving, institutional repositories, citation impact, research performance metrics, publishing reform, copyright reform, open access journals, and open access mandates.

H-index

The h-index is an author-level metric that attempts to measure both the productivity and citation impact of the publications of a scientist or scholar. The index is based on the set of the scientist's most cited papers and the number of citations that they have received in other publications. The index can also be applied to the productivity and impact of a scholarly journal as well as a group of scientists, such as a department or university or country. The index was suggested in 2005 by Jorge E. Hirsch, a physicist at UC San Diego, as a tool for determining theoretical physicists' relative quality and is sometimes called the Hirsch index or Hirsch number.

Impact factor

The impact factor (IF) or journal impact factor (JIF) of an academic journal is a measure reflecting the yearly average number of citations to recent articles published in that journal. It is frequently used as a proxy for the relative importance of a journal within its field; journals with higher impact factors are often deemed to be more important than those with lower ones. The impact factor was devised by Eugene Garfield, the founder of the Institute for Scientific Information. Impact factors are calculated yearly starting from 1975 for journals listed in the Journal Citation Reports.

Journal ranking

Journal ranking is widely used in academic circles in the evaluation of an academic journal's impact and quality. Journal rankings are intended to reflect the place of a journal within its field, the relative difficulty of being published in that journal, and the prestige associated with it. They have been introduced as official research evaluation tools in several countries.

Karlsruhe Institute of Technology

The Karlsruhe Institute of Technology (KIT) (German: Karlsruher Institut für Technologie) is a public research university and one of the largest research and educational institutions in Germany. KIT was created in 2009 when the University of Karlsruhe (Universität Karlsruhe), founded in 1825 as a public research university and also known as the "Fridericiana", merged with the Karlsruhe Research Center (Forschungszentrum Karlsruhe), which had originally been established in 1956 as a national nuclear research center (Kernforschungszentrum Karlsruhe, or KfK).

KIT is one of the leading universities for engineering and the natural sciences in Europe, ranking sixth overall in citation impact. KIT is a member of the TU9 German Institutes of Technology e.V. As part of the German Universities Excellence Initiative, KIT was awarded excellence status in 2006. In the 2011 performance ranking of scientific papers, Karlsruhe ranked first in Germany and among the top ten universities in Europe in engineering and natural sciences. Ranked 26th worldwide in computer science in the internationally recognized "Times Higher Education" ranking, KIT is among the leading universities worldwide in computer science.

As of 2018, six Nobel laureates are affiliated with KIT. The Karlsruhe Institute of Technology is well known for many inventors and entrepreneurs who studied or taught there, including Heinrich Hertz, Karl Friedrich Benz and the founders of SAP SE.

Nova Science Publishers

Nova Science Publishers is an academic publisher of books, encyclopedias, handbooks, e-books and journals, based in Hauppauge, New York. It was founded in 1985 in New York by Frank Columbus, former senior editor of Plenum Publishing, whose wife, Nadya Columbus, took over upon his death in 2010. While the firm publishes works in several fields of academia, most of its publications cover the fields of science, social science, and medicine. As of February 2018, it listed 100 currently published journals.

In 2018, Nova was ranked 13th among the main global publishers in political science over the preceding five years.

Nova Science Publishers is included in the Book Citation Index.

In terms of number of books published from 2005 to 2012, Nova ranked 4th. They ranked in the top three in 8 of 14 scientific fields (engineering, clinical medicine, human biology, animal and plant biology, geosciences, social science medicine, health, chemistry, physics, and astronomy), and ranked as the 5th most prolific book publisher from 2009-2013, ranking 3rd in Engineering and Technology and 2nd in Science by numbers of books published. They were also ranked as the 6th most productive publisher, according to a University of Granada study. However, Nova had the lowest citation impact among the five most prolific publishers in both fields.

Open access

Open access (OA) is a mechanism by which research outputs are distributed online, free of cost or other barriers, and, in its most precise meaning, with the addition of an open license applied to promote reuse.

Academic articles (as historically seen in print-based academic journals) have been the main focus of the movement. Conventional (non-open access) journals cover publishing costs through access tolls such as subscriptions, site licenses or pay-per-view charges. Open access can be applied to all forms of published research output, including peer-reviewed and non peer-reviewed academic journal articles, conference papers, theses, book chapters, and monographs.

ResearchGate

ResearchGate is a social networking site for scientists and researchers to share papers, ask and answer questions, and find collaborators. According to a study by Nature and an article in Times Higher Education, it is the largest academic social network in terms of active users, although other services have more registered users and more recent data suggests that almost as many academics have Google Scholar profiles.

While reading articles does not require registration, people who wish to become site members need to have an email address at a recognized institution or to be manually confirmed as a published researcher in order to sign up for an account. Members of the site each have a user profile and can upload research output including papers, data, chapters, negative results, patents, research proposals, methods, presentations, and software source code. Users may also follow the activities of other users and engage in discussions with them. Users are also able to block interactions with other users.

The site has been criticized for sending unsolicited email invitations to coauthors of the articles listed on the site that were written to appear as if the email messages were sent by the other coauthors of the articles (a practice the site said it has discontinued as of November 2016) and for automatically generating apparent profiles for non-users who have sometimes felt misrepresented by them. A study found that over half of the uploaded papers appear to infringe copyright, because the authors uploaded the publisher's version.

Self-archiving

Self-archiving is the act of (the author's) depositing a free copy of an electronic document online in order to provide open access to it. The term usually refers to the self-archiving of peer-reviewed research journal and conference articles, as well as theses and book chapters, deposited in the author's own institutional repository or open archive for the purpose of maximizing its accessibility, usage and citation impact. The term green open access has become common in recent years, distinguishing this approach from gold open access, where the journal itself makes the articles publicly available without charge to the reader.

UCSB College of Engineering

The College of Engineering (CoE) is one of the three undergraduate colleges at the University of California, Santa Barbara.

As of 2015, there were 150 faculty, 1,450 undergraduate students, and 750 graduate students. According to the Leiden Ranking, engineering and physical sciences at UCSB are ranked #1 among public universities for top 10% research citation impact. According to the National Research Council rankings, the UCSB engineering graduate research program in Materials was ranked #1 and Chemical Engineering ranked #5 in the nation among public universities.

University Ranking by Academic Performance

The University Ranking by Academic Performance, abbreviated as URAP, was developed in the Informatics Institute of Middle East Technical University. Since 2010, it has been publishing annual national and global college and university rankings for the top 2,000 institutions. The scientometric measurements of URAP are based on data obtained from the Institute for Scientific Information via Web of Science and InCites. For global rankings, URAP employs indicators of research performance including the number of articles, citations, total documents, article impact total, citation impact total, and international collaboration. In addition to global rankings, URAP publishes regional rankings for universities in Turkey using additional indicators, such as the numbers of students and faculty members, obtained from the Center of Measuring, Selection and Placement (ÖSYM).

WebCite

WebCite is an on-demand archiving service, designed to digitally preserve scientific and educationally important material on the web by taking snapshots of Internet content as it existed at the time a blogger, scholar, or Wikipedia editor cited or quoted it. The preservation service enables verification of claims supported by the cited sources even when the original web pages are revised, removed, or otherwise disappear, an effect known as link rot.
