Gray Matters: Social Media Changing the World of Publication Metrics


Like so much of today’s society, social media affects the world of publication metrics – a critical component of academic promotion and tenure. Traditional bibliometrics apply analytic tools and statistical methods to measure scholarly productivity.1-5 These rely largely on conventional publications and databases, such as PubMed, which are accessed and quantified to estimate an individual’s academic performance.1,2 A physician’s curriculum vitae (CV) also documents these efforts. Yet online views, self-publication and dissemination through social media platforms, including Twitter and Instagram, are increasingly influential. How can this conundrum be successfully navigated?

Traditional Metrics


The wide variability in tools for analyzing academic productivity prompted JE Hirsch to propose the h-index in 2005. An author’s h-index is the largest number h of their published papers that have each been cited at least h times.2 A researcher with twelve published manuscripts, each cited at least twelve times, has an h-index of 12. This novel way of measuring scientific productivity allows comparison of an individual to his or her academic peers, and it is easily available through online databases, such as the Web of Science, Scopus and Google Scholar.1 Many medical and non-medical scientific disciplines have embraced this measure, including neurosurgery. A study of 1,120 academic neurosurgeons noted an average h-index of 9, with higher h-indices linked to higher academic rank.1 Like any measure, the h-index has limitations: it depends on the accessibility and fidelity of the database used to determine it, it may be inflated by repeated self-citation and it has a “ceiling effect” (papers may not be credited in the h-index when the author’s overall citation counts are low). It also gives the same weight to review articles as to original scientific endeavors, it slightly favors senior researchers, who have had a longer window of time to accumulate citations, and it favors fields with greater numbers of researchers and publications.1,5,6
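The h-index calculation described above can be sketched in a few lines of Python. This is an illustrative helper, not the interface of any cited database; the function name and the citation-count input are assumptions for the example.

```python
def h_index(citations):
    """Hirsch's h-index: the largest h such that the author has
    h papers with at least h citations each.

    `citations` is a list of per-paper citation counts
    (a hypothetical input; real databases supply these counts).
    """
    cites = sorted(citations, reverse=True)  # most-cited papers first
    h = 0
    for rank, count in enumerate(cites, start=1):
        if count >= rank:   # the paper at this rank still has >= rank citations
            h = rank
        else:
            break           # once counts fall below rank, h cannot grow
    return h
```

For the example in the text, twelve papers with at least twelve citations each (`h_index([12] * 12)`) yield an h-index of 12.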

Given the potential limitations, the following modifications to the h-index have been proposed:

  • The m-index, defined as the h-index divided by the number of years since the individual’s first publication, which addresses the issue of author seniority.1,2,5,6
  • The g-index, the largest number g such that an author’s g most-cited papers have together received at least g² citations, which gives greater weight to an author’s most highly cited works.3,6
  • The e-index, the square root of the excess citations in the h-core: the total citations received by the h papers minus h².4,6
  • Google’s i10-index, the number of an author’s articles with 10 or more citations.5,6
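The four variants above can be sketched alongside the h-index. Again, these are illustrative helpers under the definitions given in the text, with hypothetical names and inputs, not any database’s actual API.

```python
import math

def m_index(h, years_since_first_publication):
    """m-index: the h-index normalized by career length."""
    return h / years_since_first_publication

def g_index(citations):
    """g-index (Egghe): largest g such that the top g papers
    have together received at least g**2 citations."""
    cites = sorted(citations, reverse=True)
    total, g = 0, 0
    for rank, count in enumerate(cites, start=1):
        total += count
        if total >= rank * rank:
            g = rank
    return g

def e_index(citations, h):
    """e-index (Zhang): sqrt of excess citations in the h-core,
    i.e., citations to the top h papers minus h**2."""
    h_core = sorted(citations, reverse=True)[:h]
    return math.sqrt(sum(h_core) - h * h)

def i10_index(citations):
    """i10-index: number of papers with at least 10 citations."""
    return sum(1 for count in citations if count >= 10)
```

For a hypothetical record of five papers cited [25, 12, 10, 5, 2] times (h-index 3), the g-index is 5, the i10-index is 3 and the e-index is √(47 − 9) ≈ 6.2 – each index emphasizing a different aspect of the same citation profile.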

All these metrics try to distill the significance of a researcher’s work. In the end, however, any index of academic productivity is valid only when viewed in the context of an individual’s entire portfolio in a particular field, as assessed by their peers; such indices are an excellent point of reference that can strongly guide additional discussion (Table 1).1-6 Decisions regarding academic advancement may also take into account other factors, such as departmental and organizational citizenship; clinical volume and outcomes; quality metrics; teaching and mentorship; board certification; leadership in professional organizations; acquisition of research funding; and presentations at national or international conferences.1,5




Social Media’s Impact

The evolution of the Internet and social media platforms has altered the dynamics of scientific access and visibility.7-10 Internet searches yield substantial results on almost any topic and guide deeper review of the subject matter, with non-PubMed-indexed sources of increasing value. The Internet has also changed the dynamics of the publication process: communication is rapid, collaboration between researchers is easier, scholarly output is more accessible and publication costs have decreased.7 Service providers with advanced software algorithms have implemented the semantic web to promote interoperability between different systems. Perhaps most important, articles are now individualized with their own unique identity, rather than grouped together in a journal issue. This makes them more accessible and allows credit to be attributed to individual authors, more accurately informing hiring, promotion and funding decisions.7

THE EXPERTS WEIGH IN

Gail Y. Hendler, MLS, Health Sciences Library, Loyola University Chicago

Scholarly productivity, seen through the lens of promotion and tenure, relies on citation analysis, which measures both the impact of an individual author’s productivity over time (h-index) and its influence, as determined by journal rank (impact factor). Citation analysis is the traditional measure used to analyze the impact of research on other research.1 Altmetrics use data derived from the web (social media), such as page views, Facebook likes and Tweets, as well as article downloads, bookmarks, saves, shares and comments, to quantify a publication’s interest, attention, influence and reach.1,2 Viewed together, traditional and alternative metrics communicate a broader, more diverse and fuller description of a researcher’s work.

The use of altmetrics is gaining traction among many scholars precisely because they capture a wide array of data documenting how quickly and broadly their work is noticed, communicated, shared and discussed across and beyond the scientific community. Scientists are including altmetrics in their CVs, grant applications, NIH biosketch profiles and promotion packages as indicators of the diverse reach and influence of their work and to reveal the fuller impact of their research that traditional metrics miss.1,2 Potentially, altmetrics can also track the flow of research and its impact on society, helping investigators comply with policies set by funding agencies, such as the National Science Foundation’s broader-impacts mandate.1,3

There is a growing literature on how altmetrics compare with and complement traditional metrics. One recent paper characterized the recent Parkinson’s disease literature that had received the highest Altmetric Attention Scores and compared that measure to traditional metrics. The findings revealed that altmetrics offer a new perspective on the attention surrounding scholarship and, when approached judiciously in an era of alt-facts and alt-news, provide an index of scientific success.4 While there are advantages and disadvantages to both citation-based and web-based metrics, each provides a unique perspective on research, and the two offer complementary insights that are critical in evaluating reach, output and impact.

 

REFERENCES

1. Spearman, C. M., Quigley, M. J., Quigley, M. R., & Wilberger, J. E. (2010). Survey of the h-index for all of academic neurosurgery: another power-law phenomenon? Journal of Neurosurgery, 113(5), 929-933.

2. Hirsch, J. E. (2005). An index to quantify an individual’s scientific research output. Proceedings of the National Academy of Sciences, 102(46), 16569-16572.

3. Egghe, L. (2006). Theory and practice of the g-index. Scientometrics, 69, 131-152.

4. Zhang, C. (2009). The e-Index, Complementing the h-Index for Excess Citations. PLoS ONE, 4(5).

5. Khan, N. R., Thompson, C. J., Taylor, D. R., Venable, G. T., Wham, R. M., Michael, L. M., & Klimo, P. (2014). An analysis of publication productivity for 1225 academic neurosurgeons and 99 departments in the United States. Journal of Neurosurgery, 120(3), 746-755.

6. Maabreh, M., & Alsmadi, I. M. (2012). A survey of impact and citation indices: Limitations and issues. International Journal of Advanced Science and Technology, 40, 35-54.

7. Melero, R. (2015). Altmetrics – a complement to conventional metrics. Biochemia Medica, 25(2), 152-160.

8. Hicks, D., Wouters, P., Waltman, L., Rijcke, S. D., & Rafols, I. (2015). Bibliometrics: The Leiden Manifesto for research metrics. Nature, 520(7548), 429-431.

9. Piwowar, H. (2013). Altmetrics: Value all research products. Nature, 493(7431), 159.

10. Cheung, M. K. (2013). Altmetrics: Too soon for use in assessment. Nature, 494(7436), 176.

11. Barbic, D., Tubman, M., Lam, H., & Barbic, S. (2016). An analysis of altmetrics in emergency medicine. Academic Emergency Medicine, 23(3), 251-268.

