Understanding ORCID Adoption Among Academic Researchers

Just over a decade ago, ORCID (Open Researcher and Contributor ID) was created to provide a unique digital identifier for researchers around the world. The ORCID has proven essential in identifying individual researchers and their publications, both for bibliometric research analyses and for universities and other organizations tracking the research productivity and impact of their personnel. Yet widespread adoption of the ORCID by individual researchers has proved elusive, with previous studies finding adoption rates ranging from 3% to 42%. Using a national survey of U.S. academic researchers at 31 research universities, we investigate why some researchers adopt an ORCID and some do not. We found an overall adoption rate of 72%, with adoption rates across academic disciplines ranging from a low of 17% in the visual and performing arts to a high of 93% in the biological and biomedical sciences. Many academic journals require an ORCID to submit a manuscript, and this is the main reason why researchers adopt one. The top three reasons for not having an ORCID are not seeing the benefits, being far enough along in one's academic career not to need it, and working in an academic discipline where it is not needed.

Science's Golden Oldies: the Decades-old Research Papers Still Heavily Cited Today

An analysis for Nature reveals the studies that appear most in the reference lists of current publications.

Why Are Women Cited Less Than Men?

Strong evidence suggests that women are not cited less per article than men, but that they accumulate fewer citations over time and at the career level. Cary Wu argues that a focus on research productivity is key to understanding and closing the gender citation gap.

ERROR: A Bug Bounty Program for Science

ERROR is a bug bounty program for science to systematically detect and report errors in academic publications.

Designing for Diversity - What Makes People Pick Up a Science Magazine?

Jemima Coleman and Wendy Sadler argue that science magazines have a responsibility to ensure that science is accessible and inclusive for all.

10 Frontiers Articles That Caught the World's Attention in 2022 - Science & Research News

By Frontiers' science writers. As part of Frontiers' passion to make science available to all, we highlight just a small selection of the most fascinating research published with us each month to help inspire current and future researchers to achieve their research dreams. 2022 was no different, and saw many game-changing discoveries contribute to the...

Focus on PhD Quality, Not Publications: We Need to Encourage Scholars to Become Inquisitive Explorers, Papers Will Naturally Follow

Does requiring students to publish a research paper before thesis submission lead to a high-quality PhD thesis, or does high-quality PhD work lead to publications in good journals? This question is unlike the chicken...

Who'll Pay for Public Access to Federally Funded Research?

The White House painted an incomplete economic picture of its new policy for free, immediate access to research produced with federal grants. Will publishers adapt their business models to comply, or will scholars be on the hook?

Quality Shines when Scientists Use Publishing Tactic Known As Registered Reports, Study Finds

Papers accepted by journals before results are known rate higher on rigor than standard studies.

Quantitative Quality: a Study on How Performance-based Measures May Change the Publication Patterns of Danish Researchers

Nations the world over are increasingly turning to quantitative performance-based metrics to evaluate the quality of research outputs, as these metrics are abundant and provide an easy means of ranking research. In 2010, the Danish Ministry of Science and Higher Education followed this trend and began portioning out a percentage of the available research funding according to how many research outputs each Danish university produces. Not all research outputs are eligible: only those published in a curated list of academic journals and publishers, the so-called BFI list, are included. The BFI list is ranked, which may create incentives for academic authors to target certain publication outlets or publication types over others. In this study we examine the potential effect these relatively new research evaluation methods have had on the publication patterns of researchers in Denmark. The study finds that publication behaviors in the Natural Sciences & Technology and in the Social Sciences and Humanities (SSH) have changed, while the Health Sciences appear unaffected. Researchers in Natural Sciences & Technology appear to focus on high-impact journals that earn more BFI points. While researchers in SSH have also increased their focus on the impact of the publication outlet, they also appear to have altered their preferred publication types, publishing more journal articles in the Social Sciences and more anthologies in the Humanities.

An Extensive Analysis of the Presence of Altmetric Data for Web of Science Publications Across Subject Fields and Research Topics

This paper presents a state-of-the-art analysis of the presence of 12 kinds of altmetric events for nearly 12.3 million Web of Science publications published between 2012 and 2018.

Overcoming the Discoverability Crisis

The COVID-19 pandemic has exposed a host of issues with the current scholarly communication system, including the discoverability of scientific knowledge. Many research groups have pivoted to Covid-19 research without prior experience or adequate preparation. They were immediately confronted with two discovery challenges: (1) having to identify relevant knowledge from unfamiliar (sub-)disciplines with their own terminology and publication culture, and (2) having to keep up with the rapid growth of data and publications and being able to filter out the relevant findings.

The Pandemic Is Pushing Scientists To Rethink How They Read Research Papers

The coronavirus pandemic has posed a special challenge for scientists: figuring out how to make sense of a flood of scientific papers from labs and scientists unfamiliar to them.

Delineating COVID-19 and Coronavirus Research

Many initiatives are keeping track of research on COVID-19 and coronaviruses. These initiatives, while valuable because they allow fast access to relevant research, raise the question of subject delineation. We analyse here one such initiative, the COVID-19 Open Research Dataset (CORD-19).