Free Academic Databases To Replace Web Of Science

Can free databases truly replace paid systems in science evaluation and reshape who gets funding and recognition in academia today?
A substantial share of public research funding is spent each year on university subscriptions to academic databases, most notably the Web of Science (WoS), the long-established industry standard for journal rankings, research evaluation, and funding decisions. For the general public, this means that much of the research supported by taxpayers remains locked behind paywalls. For early-career researchers, it means that career opportunities and access to funding are still closely tied to a system shaped by commercial providers.

Since the launch of the Budapest Open Access Initiative (BOAI) in 2002, the global open science movement has promoted a different vision: one in which free and open databases make scholarly information easier to discover, track, and use. Tools such as Unpaywall have already helped expand access to research. But a critical question remains unanswered: can these free databases serve as reliable alternatives to Web of Science for measuring scientific impact within the science system and beyond? A study led by Prof. Shuo Xu at Beijing University of Technology, published in the Journal of Informetrics, puts this question to the test. The paper, titled “Do OpenCitations and Dimensions serve as an alternative to Web of Science for calculating disruption indexes?”, compares the three databases head-to-head.

What’s at stake? The power of citation data

Most people outside academia rarely see how strongly academic bibliographic databases influence the direction of science. At the center of this system is citation data: the record of how often other researchers cite a given research paper. For decades, these records have underpinned a range of research indicators that can influence hiring, tenure, promotion, funding, and assessments of scholarly impact.

One of the most closely watched of these measures is the disruption index (DI). In simple terms, it is designed to show whether a paper opens up a new line of research or mainly develops existing ideas. Because the metric depends on complete and reliable citation links, gaps in the underlying data can distort the result.
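To make the metric concrete: the disruption index is conventionally computed from three groups of later papers, those that cite the focal paper but none of its references (a sign of disruption), those that cite both the focal paper and its references (a sign of consolidation), and those that cite only the references. The sketch below is an illustrative implementation of that standard formula, not code from the study; the function name and set-based inputs are assumptions for the example.

```python
def disruption_index(focal_citers, reference_citers):
    """Compute a simple disruption index (DI) for one focal paper.

    focal_citers: set of IDs of later papers that cite the focal paper
    reference_citers: set of IDs of later papers that cite at least
        one of the focal paper's references
    Returns a value in [-1, 1]: positive = disruptive, negative = consolidating.
    """
    n_f = len(focal_citers - reference_citers)  # cite focal paper only
    n_b = len(focal_citers & reference_citers)  # cite focal paper and its references
    n_r = len(reference_citers - focal_citers)  # cite references only
    total = n_f + n_b + n_r
    return (n_f - n_b) / total if total else 0.0


# Toy example: papers A and B cite only the focal paper, C cites both,
# D cites only the references -> DI = (2 - 1) / 4 = 0.25
print(disruption_index({"A", "B", "C"}, {"C", "D"}))  # 0.25
```

Because the formula divides by the total count of all three groups, a database that is missing even a few citation links can shift a paper's score, which is exactly why gaps in the underlying data matter.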

For decades, the WoS has played a dominant role in providing this citation data. But its high subscription costs have limited access for many institutions, especially in low- and middle-income countries, contributing to a more unequal global research system. That raises an important question: can open and lower-cost databases provide the same foundation for evaluating scientific influence?

Three databases, three access models

To find out whether open alternatives could match Web of Science, the researchers turned to two alternative databases, each offering a different level of access to citation data.

First is the Dimensions platform, which operates under a freemium model: free for individual use and basic searches, but with restricted access for large-scale data extraction. It has expanded rapidly and now ranks among the world’s largest linked research databases, covering more than 140 million publications.

Second is OpenCitations, a fully open database developed to make citation data freely available without subscription barriers or paywalls.

Earlier comparisons had mostly focused on coverage, that is, how many papers each database contains. What had not been rigorously tested was whether the open alternatives could yield results comparable to those from Web of Science when used for research evaluation.

Evidence from four fields of science

The comparison covered four research areas: two well-established fields (synthetic biology and astronomy and astrophysics), and two emerging ones (blockchain-based information systems management and the socio-economic impacts of biological invasions). For each field, the researchers calculated disruption index (DI) scores for hundreds of papers across the three databases and compared how closely the results aligned.
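The study's statistical analysis is more elaborate than this, but the basic idea of checking how closely DI scores from two databases "align" can be illustrated with a rank correlation: if two databases order the same papers from most to least disruptive in the same way, the correlation is high. The minimal pure-Python Spearman sketch below (assuming no tied scores) is an illustration of that idea, not the paper's method.

```python
def spearman_rho(x, y):
    """Spearman rank correlation between two equal-length score lists.

    Assumes no tied values, so the classic formula
    rho = 1 - 6 * sum(d^2) / (n * (n^2 - 1)) applies directly.
    """
    def ranks(values):
        # Rank 1 = smallest value; build rank list in original order.
        order = sorted(range(len(values)), key=lambda i: values[i])
        r = [0] * len(values)
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r

    rx, ry = ranks(x), ranks(y)
    n = len(x)
    d_squared = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d_squared / (n * (n ** 2 - 1))


# Hypothetical DI scores for three papers from two databases:
# identical ordering -> perfect rank agreement (rho = 1.0)
print(spearman_rho([0.10, 0.25, 0.40], [0.12, 0.20, 0.38]))  # 1.0
```

A correlation near 1 would mean the cheaper database ranks papers almost identically to the reference database, which is what matters most when scores feed into comparative evaluation.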

Taken together, the results indicate that Dimensions aligns more closely with WoS than OpenCitations does, in both established and emerging fields. Its broader coverage appears to make it a more reliable alternative for disruption index calculation, even though some differences remain. OpenCitations offers the advantages of full openness and easier access, but gaps in its citation data, particularly for paywalled content, still limit its performance. More broadly, the findings highlight the practical trade-offs involved in choosing between open, restricted, and closed bibliographic databases.

What this means for the future of open science

Open access is not only about making research papers free to read, but also about making the bibliographic databases used to evaluate science more widely accessible. This comparison suggests that progress is possible, but incomplete. A lower-cost database can now produce results much closer to Web of Science than a fully open alternative can, which may help reduce barriers to research evaluation for institutions with limited resources.

At the same time, the results also show that full openness does not yet guarantee full reliability. OpenCitations remains a valuable open resource, but gaps in citation coverage still limit its use in high-stakes assessment. More broadly, the comparison highlights a central challenge for open access: openness, affordability, and analytical reliability do not always advance in step.

Making research papers open is only part of the story. If citation data remain locked behind paywalls, research evaluation will remain unequal.

Xin An 

Reference

Xu, S., Wang, C., An, X., Deng, Y., & Liu, J. (2025). Do OpenCitations and Dimensions serve as an alternative to Web of Science for calculating disruption indexes? Journal of Informetrics, 19(3), 101685. https://doi.org/10.1016/j.joi.2025.101685

Coauthors

Shuo Xu

 

Shuo Xu is a Professor at the College of Economics and Management, Beijing University of Technology. His research focuses on technology foresight, scientometrics, industrial analysis, and big data mining. Dr. Xu has published extensively in leading journals and presented at major international conferences. He has been listed among the World’s Top 2% Scientists for five consecutive years (2021–2025).

Congcong Wang


Congcong Wang is a PhD candidate at the School of Economics and Management, Beijing University of Technology. Her research interests center on scientometrics and data mining.
Yunkang Deng

 

Yunkang Deng holds a Master’s degree in Applied Statistics from the School of Economics and Management, Beijing Forestry University. His research focuses primarily on citation network analysis.

Jianhua Liu

Jianhua Liu is a researcher at Beijing Wanfang Data Co., Ltd, a company invested in by the Institute of Science and Technology Information of China (ISTIC). She is responsible for exploring new technologies and applications for knowledge services. Her current research focuses on scholarly big data mining, open research data, informetrics, and research integrity. Dr. Liu has published more than 50 research articles and holds 5 patents.

Key Insights

Open databases can democratise research evaluation.
Open science faces trade-offs between access and reliability.
Citation data shape academic careers, funding, and global visibility.
Dimensions currently offers the closest practical alternative to Web of Science.
OpenCitations is fully accessible, but citation gaps still limit its reliability.


© 2025 all rights reserved by thesciencematters.org