Altmetrics

In scholarly and scientific publishing, altmetrics are non-traditional bibliometrics[2] proposed as an alternative[3] or complement[4] to more traditional citation impact metrics, such as the impact factor and h-index.[5] The term altmetrics was proposed in 2010,[1] as a generalization of article-level metrics,[6] and has its roots in the #altmetrics hashtag. Although altmetrics are often thought of as metrics about articles, they can be applied to people, journals, books, data sets, presentations, videos, source code repositories, web pages, etc. Altmetrics use public APIs across platforms to gather data with open scripts and algorithms. Altmetrics did not originally cover citation counts,[7] but calculate scholarly impact based on diverse online research outputs, such as social media, online news media and online reference managers.[8][9] They demonstrate both the impact and the detailed composition of that impact.[1] Altmetrics could be applied as a research filter,[1] in promotion and tenure dossiers and grant applications,[10][11] and for ranking newly published articles in academic search engines.[12]
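
Such data gathering is typically done programmatically. As a minimal sketch, the snippet below fetches the altmetrics record for a single DOI over HTTP; the endpoint and JSON field names follow the publicly documented Altmetric.com v1 API as an assumption and should be checked against the provider's current documentation before use.

    # Minimal sketch of gathering altmetrics for one paper via a public HTTP API.
    # The endpoint and field names mirror the Altmetric.com v1 API as assumed above;
    # other providers expose similar, but differently named, endpoints.
    import json
    import urllib.error
    import urllib.request

    def fetch_altmetrics(doi):
        """Return the raw altmetrics record for a DOI, or {} if none exists."""
        url = "https://api.altmetric.com/v1/doi/" + doi
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                return json.load(resp)
        except urllib.error.HTTPError:  # a 404 means no mentions are tracked yet
            return {}

    record = fetch_altmetrics("10.1371/journal.pone.0064841")
    for key in ("score", "cited_by_tweeters_count", "cited_by_posts_count"):
        print(key, record.get(key))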

[Image: The original logotype from the Altmetrics Manifesto.[1]]

Adoption

The development of Web 2.0 has changed how research publications are sought and shared within and outside the academy, and it also provides new and innovative constructs for measuring the broad scientific impact of scholarly work. Although traditional metrics are useful, they may be insufficient to measure immediate and uncited impacts, especially outside the peer-review realm.[1]

Projects such as ImpactStory[13][14] and companies including Altmetric[13][15] and Plum Analytics[13][16][17][18] calculate altmetrics. Several publishers have started providing such information to readers, including BioMed Central, the Public Library of Science (PLOS),[19][20] Frontiers,[21] Nature Publishing Group,[22] and Elsevier.[23][24]

In 2008, the Journal of Medical Internet Research started to systematically collect tweets about its articles.[25] Starting in March 2009, the Public Library of Science also introduced article-level metrics for all articles.[19][20][26] Funders have started showing interest in alternative metrics,[27] including the UK Medical Research Council.[28] Altmetrics have been used by researchers in applications for promotion review.[29] Furthermore, several universities, including the University of Pittsburgh, are experimenting with altmetrics at an institutional level.[29]

However, it has also been observed that an article needs only a little attention to jump to the upper quartile of ranked papers,[30] suggesting that not enough sources of altmetrics are currently available to give a balanced picture of impact for the majority of papers.

To determine the relative impact of a paper, a service that calculates altmetrics needs a considerably sized knowledge base. The following table shows the number of papers covered by various services (as of 2016):

Website           Number of papers
Plum Analytics    ~29.7 million[31]
Altmetric.com     >5 million[32]
ImpactStory       ~1 million[33]

Categories

Altmetrics are a very broad group of metrics, capturing the various kinds of impact a paper or other scholarly work can have. A classification of altmetrics was proposed by ImpactStory in September 2012,[34] and a very similar classification is used by the Public Library of Science:[35]

  • Viewed – HTML views and PDF downloads
  • Discussed – journal comments, science blogs, Wikipedia, Twitter, Facebook and other social media
  • Saved – Mendeley, CiteULike and other social bookmarks
  • Cited – citations in the scholarly literature, tracked by Web of Science, Scopus, CrossRef and others
  • Recommended – for example used by F1000Prime[36]

Viewed

One of the first alternative metrics to be used was the number of views of a paper. Traditionally, an author would wish to publish in a journal with a high subscription rate, so that many people would have access to the research. With the introduction of web technologies it became possible to count how often a single paper is actually viewed. Typically, publishers count the number of HTML views and PDF downloads. As early as 2004, the BMJ published the number of views for its articles, which was found to be somewhat correlated with citations.[37]

Discussed

The discussion of a paper can be seen as a metric that captures its potential impact. Typical sources of data for this metric include Facebook, Google+, Twitter, science blogs, and Wikipedia pages. Some researchers regard mentions on social media as citations; citations on a social media platform can be divided into internal citations, such as retweets, and external citations, such as tweets containing links to outside documents.[38] The correlation between mentions and likes on the one hand and citation in the primary scientific literature on the other has been studied, and at best a slight correlation was found, for example for articles in PubMed.[4]

In 2008 the Journal of Medical Internet Research began publishing views and tweets. These "tweetations" proved to be a good indicator of highly cited articles, leading the author to propose a "twimpact factor", the number of tweets an article receives in the first seven days after publication, as well as a "twindex", the rank percentile of an article's twimpact factor.[25] Research shows, however, that twimpact factor scores are highly subject-specific, so comparisons should only be made between papers in the same subject area.[25] It is also important to note that, while past research has demonstrated a correlation between tweetations and citations, the relationship is not necessarily causal: it is unclear whether higher citation counts result from greater media attention via Twitter and other platforms, or simply reflect the quality of the article itself.[25]
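
The two Twitter-based indicators defined above lend themselves to a straightforward computation. The sketch below is illustrative only, assuming tweet timestamps have already been collected for each article; the seven-day window follows the definition given above, and the function and variable names are not taken from any particular tool.

    # Sketch of the "twimpact factor" (tweets within 7 days of publication) and the
    # "twindex" (rank percentile of that count within a comparison set of articles).
    from datetime import datetime, timedelta

    def twimpact_factor(published, tweet_times, days=7):
        """Count tweets about an article posted within `days` of its publication."""
        cutoff = published + timedelta(days=days)
        return sum(published <= t <= cutoff for t in tweet_times)

    def twindex(article_tw, all_tw):
        """Rank percentile of one article's twimpact factor within a comparison set."""
        below_or_equal = sum(tw <= article_tw for tw in all_tw)
        return 100.0 * below_or_equal / len(all_tw)

    published = datetime(2011, 12, 19)
    tweets = [published + timedelta(days=d) for d in (0, 1, 1, 3, 6, 12)]
    tw = twimpact_factor(published, tweets)          # -> 5 tweets within one week
    print(tw, twindex(tw, [0, 1, 2, 5, 9, 20]))      # twimpact factor and percentile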

Recent research conducted at the individual level, rather than the article level, supports the use of Twitter and other social media platforms as a mechanism for increasing impact.[39] Results indicate that researchers whose work is mentioned on Twitter have significantly higher h-indices than researchers whose work is not mentioned there. The study highlights the role of discussion-based platforms, such as Twitter, in increasing the value of traditional impact metrics.

Besides Twitter and other streams, blogging has been shown to be a powerful platform for discussing literature. Various platforms keep track of which papers are being blogged about. Altmetric.com uses this information when calculating metrics, while other tools, such as ResearchBlogging and Chemical blogspace, simply report where the discussion is happening.

Recommended

Platforms may even provide a formal way of ranking or otherwise recommending papers, such as Faculty of 1000.[40]

Saved

It is also informative to quantify the number of times a paper has been saved or bookmarked. Individuals typically bookmark items that are highly relevant to their own work, so bookmarks may be an additional indicator of the impact of a specific study. Providers of such information include science-specific social bookmarking services such as CiteULike and Mendeley.

Cited

The cited category is defined more narrowly than the discussed category. Besides traditional metrics based on citations in the scientific literature, such as those obtained from Google Scholar, CrossRef, PubMed Central, and Scopus, altmetrics also draw on citations in secondary knowledge sources. For example, ImpactStory counts the number of times a paper has been referenced by Wikipedia.[41] Plum Analytics also provides metrics for various academic publications,[42] seeking to track research productivity, and PLOS article-level metrics can likewise be used to obtain information on engagement.[42]

Interpretation

While there is little consensus yet on the validity and consistency of altmetrics,[43] their interpretation in particular is debated. Proponents of altmetrics make clear that many of the metrics show attention or engagement rather than the quality of the contribution to the progress of science.[35] Even citation-based metrics do not indicate whether a high score reflects a positive impact on science: papers are also cited by papers that disagree with them, an issue addressed, for example, by the Citation Typing Ontology project.[44]

Altmetrics can be interpreted more appropriately when detailed context and qualitative data are provided. For example, to evaluate the contribution of a scholarly work to policymaking using altmetrics, qualitative data such as who is citing the work online[12] and how relevant those online citations are to policymaking should be provided as evidence.[45]

Given the relatively low correlation between traditional metrics and altmetrics, altmetrics may measure complementary aspects of scholarly impact, and it is reasonable to combine and compare the two types of metrics when interpreting societal and scientific impact. Researchers have built a 2x2 framework based on the interaction between altmetrics and traditional citations,[4] in which the two groups with high altmetrics/low citations and low altmetrics/high citations call for further explanation.[25][4] Used this way, altmetrics provide convenient approaches for researchers and institutions to monitor the impact of their work while avoiding inappropriate interpretations.
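
As an illustration of that framework, the sketch below assigns papers to the four quadrants. The median split used to separate "high" from "low", and all counts shown, are assumptions made here for illustration; the cited studies do not prescribe a specific threshold.

    # Illustrative sketch of the 2x2 altmetrics-vs-citations framework discussed above.
    # "High" and "low" are defined here by a median split, which is an assumption.
    from statistics import median

    papers = {                 # hypothetical (tweet count, citation count) pairs
        "A": (120, 3),
        "B": (2, 85),
        "C": (90, 70),
        "D": (1, 2),
    }
    alt_median = median(alt for alt, _ in papers.values())
    cit_median = median(cit for _, cit in papers.values())

    def quadrant(alt, cit):
        a = "high altmetrics" if alt >= alt_median else "low altmetrics"
        c = "high citations" if cit >= cit_median else "low citations"
        return a + " / " + c

    for name, (alt, cit) in papers.items():
        print(name, quadrant(alt, cit))   # e.g. "A high altmetrics / low citations"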

Controversy

The usefulness of metrics for estimating scientific impact is controversial.[46][47] Research has found that online buzz can amplify the effect of other forms of outreach on researchers' scientific impact: for nanoscientists mentioned on Twitter, interactions with reporters and non-scientists positively and significantly predicted a higher h-index, whereas no such effect was found for the group not mentioned on Twitter.[39] Altmetrics expand the measurement of scholarly impact because they are picked up quickly, reach a broader range of audiences and cover diverse research outputs. In addition, the community shows a clear need: funders demand measures of the impact of their spending, such as public engagement.

However, there are limitations that affect their usefulness, arising from technical problems and systematic biases of construct, such as data quality, heterogeneity and particular dependencies.[46] In terms of technical problems, the data might be incomplete, because it is difficult to collect online research outputs that lack direct links to their mentions (e.g. videos) and to identify different versions of the same research work. Additionally, whether the APIs themselves lead to missing data remains unresolved.[4]

As for systematic bias, like other metrics, altmetrics are prone to self-citation, gaming, and other mechanisms for boosting one's apparent impact. Altmetrics can be gamed: for example, likes and mentions can be bought.[48] Altmetrics can also be more difficult to standardize than citations; the number of tweets linking to a paper, for instance, can vary widely depending on how the tweets are collected.[49] Moreover, online popularity does not necessarily equal scientific value. Some popular online mentions may contribute little to further research discoveries, while some theory-driven or minority-focused research of great scientific importance may be marginalized online.[25] For example, the most tweeted articles in biomedicine in 2011 concerned curious or funny content, potential health applications, and catastrophes.[4]

Altmetrics for more recent articles may be higher because of the increasing uptake of the social web and because articles tend to be mentioned mainly when they are published.[50] As a result, it might not be fair to compare the altmetrics scores of articles unless they were published at a similar time. Researchers have developed a sign test to avoid this uptake bias by comparing the metrics of an article with those of the two articles published immediately before and after it.[50]
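
A simplified sketch of that idea follows: each article's metric is compared with the mean of its two immediate neighbours in publication order, and a binomial (sign) test checks whether articles exceed their neighbours' mean more often than the 50% expected by chance. The details (tie handling, neighbour definition, example data) are illustrative assumptions, not a reproduction of the cited method.

    # Simplified neighbour-comparison sign test, as sketched above. `metric` holds one
    # altmetric value per article, ordered by publication date. Illustrative only.
    from math import comb

    def neighbour_sign_test(metric):
        """Two-sided binomial p-value for 'article beats the mean of its two neighbours'."""
        wins = trials = 0
        for i in range(1, len(metric) - 1):
            neighbour_mean = (metric[i - 1] + metric[i + 1]) / 2
            if metric[i] != neighbour_mean:          # ties are discarded
                trials += 1
                wins += metric[i] > neighbour_mean
        extreme = max(wins, trials - wins)
        tail = sum(comb(trials, k) for k in range(extreme, trials + 1))
        return min(1.0, 2 * tail * 0.5 ** trials)

    print(neighbour_sign_test([3, 9, 2, 11, 1, 14, 2, 10, 3]))   # p-value for this toy series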

It should be kept in mind that metrics are only one outcome of tracking how research is disseminated and used, and altmetrics should be interpreted carefully to overcome their biases. Even more informative than how often a paper is cited is which papers are citing it: that information allows researchers to see how their work is impacting the field (or not). Providers of metrics therefore typically also give access to the underlying data from which the metrics were calculated. For example, Web of Science shows which papers cite a given paper, ImpactStory shows which Wikipedia pages reference the paper, and CitedIn shows which databases extracted data from the paper.[51]

Another concern with altmetrics, as with any metrics, is how universities or institutions use them to rank their employees and to make promotion or funding decisions;[52] some argue their use should be limited to measuring engagement.[53]

Only a small share of research output is discussed online, and coverage varies among disciplines.[25][4] This might be consistent with patterns of social media use among scientists: surveys have shown that nearly half of respondents held ambivalent attitudes towards social media's influence on academic impact and never announced their research work on social media.[54] With the ongoing shift towards open science and greater social media use, consistent altmetrics across disciplines and institutions are more likely to be adopted.

Ongoing research

The specific use cases and characteristics of altmetrics are an active research field within bibliometrics, providing much-needed data for measuring the impact of altmetrics themselves. The Public Library of Science has an Altmetrics Collection,[55] and both Information Standards Quarterly and the Aslib Journal of Information Management have published special issues on altmetrics.[56][57] A series of articles that extensively reviews altmetrics was published in late 2015.[58][59][60]

Other research examines the validity of individual altmetrics[4][25] or makes comparisons across different platforms.[50] Researchers use the correlation between altmetrics and traditional citations as a validity test, on the assumption that a positive and significant correlation shows that altmetrics, like citations, measure scientific impact accurately.[50] The low correlations found (less than 0.30[4]) lead to the conclusion that altmetrics play a complementary role in measuring scholarly impact. However, it remains unresolved which altmetrics are most valuable and what degree of correlation between the two kinds of metrics would strengthen the measurement. Additionally, the validity test itself faces technical problems; for example, replication of the data collection is impossible because data providers' algorithms change constantly.[61]
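
For readers who want to reproduce this kind of validity check on their own data, a minimal sketch follows; it assumes SciPy is available and uses made-up counts rather than data from any of the cited studies.

    # Sketch of the correlation-based validity check described above: a Spearman rank
    # correlation between an altmetric (here, tweet counts) and later citation counts.
    from scipy.stats import spearmanr

    tweets    = [12, 0, 3, 45, 1, 7, 0, 22]   # illustrative values, not real data
    citations = [ 4, 2, 5, 10, 0, 6, 1,  9]

    rho, p_value = spearmanr(tweets, citations)
    # A rho below ~0.30, as reported in the literature, would suggest the two measures
    # capture different aspects of impact rather than the same thing.
    print(f"Spearman rho = {rho:.2f} (p = {p_value:.3f})")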

See also

References

  1. ^ a b c d e Priem, Jason; Taraborelli, Dario; Groth, Paul; Neylon, Cameron (September 28, 2011). "Altmetrics: A manifesto (v 1.01)". Altmetrics.
  2. ^ "PLOS Collections". Public Library of Science (PLOS). Altmetrics is the study and use of non-traditional scholarly impact measures that are based on activity in web-based environments
  3. ^ "The "alt" does indeed stand for "alternative"" Jason Priem, leading author in the Altmetrics Manifesto -- see comment 592
  4. ^ a b c d e f g h i Haustein, Stefanie; Peters, Isabella; Sugimoto, Cassidy R.; Thelwall, Mike; Larivière, Vincent (2014-04-01). "Tweeting biomedicine: An analysis of tweets and citations in the biomedical literature". Journal of the Association for Information Science and Technology. 65 (4): 656–669. arXiv:1308.1838. doi:10.1002/asi.23101. ISSN 2330-1643.
  5. ^ Chavda, Janica; Patel, Anika (30 December 2015). "Measuring research impact: bibliometrics, social media, altmetrics, and the BJGP". British Journal of General Practice. 66 (642): e59–e61. doi:10.3399/bjgp16X683353. PMC 4684037. PMID 26719483.
  6. ^ Binfield, Peter (9 November 2009). "Article-Level Metrics at PLoS - what are they, and why should you care?" (Video). University of California, Berkeley.
  7. ^ Bartling, Sönke; Friesike, Sascha (2014). Opening Science: The Evolving Guide on How the Internet Is Changing Research, Collaboration and Scholarly Publishing. Cham: Springer International Publishing. p. 181. doi:10.1007/978-3-319-00026-8. ISBN 978-3-31-900026-8. OCLC 906269135. Altmetrics and article-level metrics are sometimes used interchangeably, but there are important differences: article-level metrics also include citations and usage data; ...
  8. ^ Mcfedries, Paul (August 2012). "Measuring the impact of altmetrics [Technically Speaking]". IEEE Spectrum. 49 (8): 28. doi:10.1109/MSPEC.2012.6247557. ISSN 0018-9235.
  9. ^ Galligan, Finbar; Dyas-Correia, Sharon (March 2013). "Altmetrics: Rethinking the Way We Measure". Serials Review. 39 (1): 56–61. doi:10.1016/j.serrev.2013.01.003.
  10. ^ Moher, David; Naudet, Florian; Cristea, Ioana A.; Miedema, Frank; Ioannidis, John P. A.; Goodman, Steven N. (2018-03-29). "Assessing scientists for hiring, promotion, and tenure". PLOS Biology. 16 (3): e2004089. doi:10.1371/journal.pbio.2004089. ISSN 1545-7885. PMC 5892914. PMID 29596415.
  11. ^ Rajiv, Nariani (2017-03-24). "Supplementing Traditional Ways of Measuring Scholarly Impact: The Altmetrics Way". hdl:10315/33652.
  12. ^ a b Mehrazar, Maryam; Kling, Christoph Carl; Lemke, Steffen; Mazarakis, Athanasios; Peters, Isabella (2018-04-08). "Can We Count on Social Media Metrics? First Insights into the Active Scholarly Use of Social Media". Proceedings of the 10th ACM Conference on Web Science. p. 215. arXiv:1804.02751. doi:10.1145/3201064.3201101. ISBN 9781450355636.
  13. ^ a b c Liu, Jean; Euan Adie (8 July 2013). "New perspectives on article-level metrics: developing ways to assess research uptake and impact online". Insights. 26 (2): 153. doi:10.1629/2048-7754.79.
  14. ^ "Impactstory: About". ImpactStory.
  15. ^ "Altmetric: About us". Altmetric. 2015-06-02.
  16. ^ Lindsay, J. Michael (15 April 2016). "PlumX from Plum Analytics: Not Just Altmetrics". Journal of Electronic Resources in Medical Libraries. 13 (1): 8–17. doi:10.1080/15424065.2016.1142836.
  17. ^ "Plum Analytics: About Us". Plum Analytics.
  18. ^ "Plum Analytics: About Altmetrics". Plum Analytics.
  19. ^ a b Fenner, Martin (1 July 2005). "Article-Level Metrics Information". Lagotto. Archived from the original on 22 September 2009.
  20. ^ a b "A Comprehensive Assessment of Impact with Article-Level Metrics (ALMs)". Public Library of Science (PLOS).
  21. ^ "About Frontiers: Academic Journals and Research Community". Frontiers.
  22. ^ Baynes, Grace (25 October 2012). "Article level metrics on nature.com". Nature.
  23. ^ Reller, Tom (15 July 2013). "Elsevier Announces 2012 Journal Impact Factor Highlights". MarketWatch.
  24. ^ Beatty, Susannah (29 July 2015). "New Scopus Article Metrics: A better way to benchmark articles | Elsevier Scopus Blog". Scopus.
  25. ^ a b c d e f g h Eysenbach, G (19 December 2011). "Can tweets predict citations? Metrics of social impact based on Twitter and correlation with traditional metrics of scientific impact". Journal of Medical Internet Research. 13 (4): e123. doi:10.2196/jmir.2012. PMC 3278109. PMID 22173204.
  26. ^ Fenner, Martin. "Public Library of Science (PLOS)". Lagotto.
  27. ^ Piwowar, Heather (9 January 2013). "Altmetrics: Value all research products". Nature. 493 (159): 159. Bibcode:2013Natur.493..159P. doi:10.1038/493159a. PMID 23302843.
  28. ^ Viney, Ian (13 February 2013). "Altmetrics: Research council responds". Nature. 494 (7436): 176. Bibcode:2013Natur.494..176V. doi:10.1038/494176c. PMID 23407530.
  29. ^ a b Kwok, Roberta (21 August 2013). "Research impact: Altmetrics make their mark". Nature. 500 (7463): 491–493. doi:10.1038/nj7463-491a.
  30. ^ Kelly, Joel (22 August 2013). "Altmetric rankings". Infiniflux.
  31. ^ "Plum Analytics: Coverage". Retrieved 31 March 2017.
  32. ^ Altmetric Engineering (2016). "Altmetric: the story so far". Figshare (Data Set). doi:10.6084/m9.figshare.2812843.v1.
  33. ^ @Impactstory (14 May 2016). "As of today, we're now tracking #altmetrics on a cool one million publications! #andGrowingFast". Twitter.
  34. ^ "A new framework for altmetrics". ImpactStory Blog. 2012-09-14.
  35. ^ a b Lin, J.; Fenner, M. (2013). "Altmetrics in Evolution: Defining and Redefining the Ontology of Article-Level Metrics". Information Standards Quarterly. 25 (2): 20. doi:10.3789/isqv25no2.2013.04.
  36. ^ F1000Prime
  37. ^ Perneger, T. V (2004). "Relation between online "hit counts" and subsequent citations: Prospective study of research papers in the BMJ". BMJ. 329 (7465): 546–7. doi:10.1136/bmj.329.7465.546. PMC 516105. PMID 15345629.
  38. ^ Weller, Katrin; Peters, Isabella (2012). Tokar, Alexander; Beurskens, Michael; Keuneke, Susanne; Mahrt, Merja; Peters, Isabella; Puschmann, Cornelius; Treeck, Timo van; Weller, Katrin, eds. Citations in Web 2.0. Düsseldorf Univ. Press. pp. 209–222. ISBN 9783943460162.
  39. ^ a b Liang, Xuan (2014). "Building Buzz: (Scientists) Communicating Science in New Media Environments". Journalism and Mass Communication.
  40. ^ Lin, Jennifer; Fenner, Martin (2013-04-01). "The many faces of article-level metrics". Bulletin of the American Society for Information Science and Technology. 39 (4): 27–30. doi:10.1002/bult.2013.1720390409. ISSN 1550-8366.
  41. ^ "FAQ: which metrics are measured?". ImpactStory.
  42. ^ a b Papakostidis, Costas; Giannoudis, Peter V. (2018). Medical Writing and Research Methodology for the Orthopaedic Surgeon. Springer, Cham. pp. 71–79. doi:10.1007/978-3-319-69350-7_9. ISBN 9783319693491.
  43. ^ Jump, Paul (23 August 2012). "Research Intelligence - Alt-metrics: fairer, faster impact data?". Times Higher Education.
  44. ^ Shotton, D. (2010). "CiTO, the Citation Typing Ontology". Journal of Biomedical Semantics. 1 (Suppl 1): S6–S1. doi:10.1186/2041-1480-1-S1-S6. PMC 2903725. PMID 20626926.
  45. ^ "How to Use Altmetrics to Showcase Engagement Efforts for Promotion and Tenure". Altmetric. 2016-10-18. Retrieved 2018-04-12.
  46. ^ a b Mike Buschman; Andrea Michalek (April–May 2013). "Are Alternative Metrics Still Alternative?". asis&t Bulletin.
  47. ^ Cheung, M. K. (2013). "Altmetrics: Too soon for use in assessment". Nature. 494 (7436): 176. Bibcode:2013Natur.494..176C. doi:10.1038/494176d. PMID 23407528.
  48. ^ J. Beall, Article-Level Metrics: An Ill-Conceived and Meretricious Idea, 2013, "Archived copy". Archived from the original on 2013-08-06. Retrieved 2013-08-10.CS1 maint: Archived copy as title (link)
  49. ^ Chamberlain, S. (2013). "Consuming Article-Level Metrics: Observations and Lessons". Information Standards Quarterly. 25 (2): 4–13. doi:10.3789/isqv25no2.2013.02.
  50. ^ a b c d Thelwall, M.; Haustein, S.; Larivière, V.; Sugimoto, C. R. (2013). "Do Altmetrics Work? Twitter and Ten Other Social Web Services". PLoS ONE. 8 (5): e64841. Bibcode:2013PLoSO...864841T. doi:10.1371/journal.pone.0064841. PMC 3665624. PMID 23724101.
  51. ^ Waagmeester, A.; Evelo, C. (2011). "Measuring impact in online resources with the CInumber (the CitedIn Number for online impact)". Nature Precedings. doi:10.1038/npre.2011.6037.1.
  52. ^ David Colquhoun, How should universities be run to get the best out of people?, 2007
  53. ^ Matthews, David (7 October 2015). "Altmetrics risk becoming part of problem, not solution, warns academic". Times Higher Education.
  54. ^ "Reports". Science, Media and the Public. 2014-09-11. Retrieved 2018-04-12.
  55. ^ Priem, Jason; Groth, Paul; Taraborelli, Dario (2012). Ouzounis, Christos A., ed. "The Altmetrics Collection". PLoS ONE. 7 (11): e48753. Bibcode:2012PLoSO...748753P. doi:10.1371/journal.pone.0048753. PMC 3486795. PMID 23133655.
  56. ^ "Topic: Altmetrics". Information Standards Quarterly (ISQ). 25 (2). Summer 2013. doi:10.3789/isqv25no2.2013. Archived from the original on 2013-08-03. Retrieved 2013-08-10.
  57. ^ Haustein, Stefanie; Peters, Isabella; Sugimoto, Cassidy R.; Thelwall, Mike; Larivière, Vincent (2015). Haustein, Stefanie; Sugimoto, Cassidy R.; Larivière, Vincent, eds. "Social Media Metrics in Scholarly Communication: exploring tweets, blogs, likes and other altmetrics". Aslib Journal of Information Management. 67 (3). arXiv:1504.01877. doi:10.1108/ajim-03-2015-0047. ISSN 2050-3806.
  58. ^ Thelwall, Mike A.; Kousha, Kayvan (2015). "Web indicators for research evaluation, part 1: Citations and links to academic articles from the web". El Profesional de la Información. 24 (5): 587–606. doi:10.3145/epi.2015.sep.08.
  59. ^ Thelwall, Mike A.; Kousha, Kayvan (2015). "Web indicators for research evaluation, part 2: Social media metrics". El Profesional de la Información. 24 (5): 607–620. doi:10.3145/epi.2015.sep.09.
  60. ^ Kousha, Kayvan; Thelwall, Mike A. (2015). "Web indicators for research evaluation, part 3: Books and non-standard outputs". El Profesional de la Información. 24 (6): 724–736. doi:10.3145/epi.2015.nov.04.
  61. ^ Liu, Jean; Adie, Euan (2013-04-01). "Five challenges in altmetrics: A toolmaker's perspective". Bulletin of the American Society for Information Science and Technology. 39 (4): 31–34. doi:10.1002/bult.2013.1720390410. ISSN 1550-8366.

External links

Academic Medicine (journal)

Academic Medicine is a monthly peer-reviewed medical journal published by the Association of American Medical Colleges.

Academic journal

An academic or scholarly journal is a periodical publication in which scholarship relating to a particular academic discipline is published. Academic journals serve as permanent and transparent forums for the presentation, scrutiny, and discussion of research. They are usually peer-reviewed or refereed. Content typically takes the form of articles presenting original research, review articles, and book reviews. The purpose of an academic journal, according to Henry Oldenburg (the first editor of Philosophical Transactions of the Royal Society), is to give researchers a venue to "impart their knowledge to one another, and contribute what they can to the Grand design of improving natural knowledge, and perfecting all Philosophical Arts, and Sciences."

The term academic journal applies to scholarly publications in all fields; this article discusses the aspects common to all academic field journals. Scientific journals and journals of the quantitative social sciences vary in form and function from journals of the humanities and qualitative social sciences; their specific aspects are separately discussed.

The first academic journal was Journal des sçavans (January 1665), followed soon after by Philosophical Transactions of the Royal Society (March 1665), and Mémoires de l'Académie des Sciences (1666). The first fully peer-reviewed journal was Medical Essays and Observations (1733).

Altmetric

Altmetric, or altmetric.com, is a data science company that tracks where published research is mentioned online, and provides tools and services to institutions, publishers, researchers, funders and other organisations to monitor this activity, commonly referred to as altmetrics. Altmetric was recognized by European Commissioner Máire Geoghegan-Quinn in 2014 as a company challenging the traditional reputation systems. Altmetric is a portfolio company of Digital Science, which is owned by Holtzbrinck Publishing Group.

Article-level metrics

Article-level metrics are citation metrics which measure the usage and impact of individual scholarly articles.

Bibliometrics

Bibliometrics is statistical analysis of written publications, such as books or articles. Bibliometric methods are frequently used in the field of library and information science, including scientometrics. For instance, bibliometrics are used to provide quantitative analysis of academic literature or for evaluating budgetary spending. Citation analysis is a commonly used bibliometric method which is based on constructing the citation graph, a network or graph representation of the citations between documents. Many research fields use bibliometric methods to explore the impact of their field, the impact of a set of researchers, the impact of a particular paper, or to identify particularly impactful papers within a specific field of research. Bibliometrics also has a wide range of other applications, such as in descriptive linguistics, the development of thesauri, and evaluation of reader usage.

Figshare

Figshare is an online open access repository where researchers can preserve and share their research outputs, including figures, datasets, images, and videos. It is free to upload content and free to access, in adherence to the principle of open data. Figshare is one of a number of portfolio businesses supported by Digital Science.

ImpactStory

ImpactStory is an open source, web-based tool that provides altmetrics to help researchers measure the impacts of their research outputs including journal articles, blog posts, datasets, and software. It aims to change the focus of the scholarly reward system to value and encourage web-native scholarship. ImpactStory is a nonprofit organisation funded by the Alfred P. Sloan Foundation and the National Science Foundation.

Impact factor

The impact factor (IF) or journal impact factor (JIF) of an academic journal is a measure reflecting the yearly average number of citations to recent articles published in that journal. It is frequently used as a proxy for the relative importance of a journal within its field; journals with higher impact factors are often deemed to be more important than those with lower ones. The impact factor was devised by Eugene Garfield, the founder of the Institute for Scientific Information. Impact factors are calculated yearly starting from 1975 for journals listed in the Journal Citation Reports.
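
As a worked example of the standard two-year calculation (numbers illustrative):

    IF(2017) = citations received in 2017 by items the journal published in 2015-2016
               divided by the number of citable items the journal published in 2015-2016

    e.g. 300 such citations to 150 citable items gives an impact factor of 2.0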

Journal ranking

Journal ranking is widely used in academic circles in the evaluation of an academic journal's impact and quality. Journal rankings are intended to reflect the place of a journal within its field, the relative difficulty of being published in that journal, and the prestige associated with it. They have been introduced as official research evaluation tools in several countries.

List of open-access projects

Some of the most important open-access publishing projects or lists of such projects are listed below.

Multiple Sclerosis Discovery Forum

Multiple Sclerosis Discovery Forum (MSDF) is a non-profit online resource created to speed progress toward a cure for multiple sclerosis (MS) and other demyelinating diseases by enabling faster sharing of information and free discussion among MS researchers in academia, industry, and the clinic.

Launched in April 2012, MSDF deploys science journalism as a primary tool in fostering communication and collaboration among MS researchers from all corners of the scientific enterprise. The site combines news and features with technical resources, such as a weekly editor-curated index of MS-related papers from PubMed and a database with the latest scientific and regulatory information about drugs being marketed or in the pipeline for treatment of MS. Other resources include interactive data visualizations, meetings and events, and discussion forums.

MSDF is modeled after the online scientific community Alzheimer Research Forum (AlzForum). Since 1996, AlzForum has become a location for information and interaction for investigators working on age-related neurodegeneration. More recently, similar independent, neutral web-based neurology disease forums have followed, including Schizophrenia Research Forum and Pain Research Forum. As with its sister forums, all content on MSDF is provided free of charge to the research community, and editorial independence from sponsors and donors is strictly maintained. MSDF articles have unique digital object identifiers (DOIs) to provide stable linking over time and to facilitate discussion and altmetrics tracking of scientific articles in social media forums. MSDF articles are indexed by Google News.

MSDF covers the plausible but unproven question of whether the dozen new anti-inflammatory therapies can be deployed more effectively against disease progression, as well as the upsurge in research into the pathological mechanisms of, and treatments for, progressive MS. Related demyelinating conditions include neuromyelitis optica (NMO), transverse myelitis, acute disseminated encephalomyelitis, and optic neuritis. Their misdiagnosis as MS can lead to inappropriate and even harmful therapeutic choices, as was discovered with NMO, now clearly understood to be a different disease.

MSDF is a joint activity of Accelerated Cure Project for Multiple Sclerosis (ACP) and the MassGeneral Institute for Neurodegenerative Disease (MIND).

Open science

Open science is the movement to make scientific research (including publications, data, physical samples, and software) and its dissemination accessible to all levels of an inquiring society, amateur or professional. Open science is transparent and accessible knowledge that is shared and developed through collaborative networks. It encompasses practices such as publishing open research, campaigning for open access, encouraging scientists to practice open notebook science, and generally making it easier to publish and communicate scientific knowledge.

Open Science can be seen as a continuation of, rather than a revolution in, practices begun in the 17th century with the advent of the academic journal, when the societal demand for access to scientific knowledge reached a point at which it became necessary for groups of scientists to share resources with each other so that they could collectively do their work. In modern times there is debate about the extent to which scientific information should be shared. The conflict that led to the Open Science movement is between the desire of scientists to have access to shared resources versus the desire of individual entities to profit when other entities partake of their resources. Additionally, the status of open access and resources that are available for its promotion are likely to differ from one field of academic inquiry to another.

Plum Analytics

Plum Analytics is a Philadelphia, Pennsylvania-based altmetrics company dedicated to measuring the influence of scientific research. It was founded in 2011 by Andrea Michalek, who is its current president, and Mike Buschman. It was acquired by Elsevier in February 2017, which purchased it from EBSCO Information Services for an undisclosed amount. Its metrics were immediately incorporated into Elsevier's existing products, including Mendeley and Scopus.

ScienceOpen

ScienceOpen is an interactive discovery environment for scholarly research across all disciplines. It is freely accessible for all and offers hosting and promotional services within the platform for publishers and institutes. The organization is based in Berlin and has a technical office in Boston. It is a member of CrossRef, ORCID, the Open Access Scholarly Publishers Association, STM Association and the Directory of Open Access Journals. The company was designated as one of “10 to Watch” by research advisory firm Outsell in its report “Open Access 2015: Market Size, Share, Forecast, and Trends.”

Semantometrics

Semantometrics is a tool for evaluating research. It is functionally an extension of tools such as bibliometrics, webometrics, and altmetrics, but instead of just evaluating citations – which entails relying on outside evidence – it uses a semantic evaluation of the full text of the research paper being evaluated.

Steve Pettifer

Stephen Robert Pettifer (born September 21, 1970) is a Professor in the School of Computer Science at the University of Manchester, UK.

Webometrics

The science of webometrics (also cybermetrics) tries to measure the World Wide Web to get knowledge about the number and types of hyperlinks, structure of the World Wide Web and usage patterns. According to Björneborn and Ingwersen (2004), the definition of webometrics is "the study of the quantitative aspects of the construction and use of information resources, structures and technologies on the Web drawing on bibliometric and informetric approaches." The term webometrics was first coined by Almind and Ingwersen (1997). A second definition of webometrics has also been introduced, "the study of web-based content with primarily quantitative methods for social science research goals using techniques that are not specific to one field of study" (Thelwall, 2009), which emphasizes the development of applied methods for use in the wider social sciences. The purpose of this alternative definition was to help publicize appropriate methods outside of the information science discipline rather than to replace the original definition within information science.

Similar scientific fields are Bibliometrics, Informetrics, Scientometrics, Virtual ethnography, and Web mining.

One relatively straightforward measure is the "Web Impact Factor" (WIF) introduced by Ingwersen (1998). The WIF may be defined as the number of web pages in a web site receiving links from other web sites, divided by the number of web pages published in the site that are accessible to the crawler. However, the use of the WIF has been disregarded because of the mathematical artifacts that arise from the power-law distributions of these variables. Similar indicators that use the size of the institution instead of the number of web pages have proved more useful.
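
Written compactly (notation introduced here only for illustration):

    WIF(site) = (pages of the site that receive links from other web sites)
                / (pages of the site accessible to the crawler)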
