Prioritization in Science: How Do We Decide What Research Matters Most?

Every day, thousands of research papers hit the shelves, each vying for attention in an already overloaded system. Recent bibliometrics reveal that scientific publications have surged by 8–9% annually for decades. In biomedicine alone, over a million papers flood the PubMed database each year—roughly two papers every minute (Landhuis, 2016).

With the sheer volume of research articles published and funding applications submitted daily, it’s nearly impossible for scientists, funders, and consumers alike to keep pace. As a result, vital research often risks slipping through the cracks.

So, how do we determine what deserves our time, resources, and focus? And why do we continue to rely on outdated metrics, given the high stakes involved? Science needs a system where the most innovative, reproducible, and impactful research stands out—a democratic ranking approach that guides us to invest in what truly matters.

Problems with Prioritization in Research

1 Flawed Metrics

Currently, ranking in academia is heavily driven by journal impact factors and citation counts. These metrics determine not only the perceived quality of scientific work but also researchers’ careers, funding prospects, and reputation.

This system is flawed.

Because the impact factor of a journal is a key metric for assessing a work's success—and often influences funding opportunities and career advancement—scientists face intense pressure to publish in high-ranking journals. This pressure can fuel a bias toward positive, eye-catching results, at times at the expense of rigorous research, and contribute to the problem of reproducibility in science. In fact, studies have found no significant correlation between a journal’s rank and the methodological soundness of the studies within it (Brembs et al., 2013). This suggests that journal prestige does not necessarily align with research quality.

Citations can often reflect the popularity of a topic or strategic citation practices rather than the quality of the research or its relevance to market needs. Publications in high-ranking journals gain wider exposure—not only due to the journal's extensive reach but also from the increased media attention these journals attract. In this landscape, researchers may strategically cite popular papers from high-ranking journals to boost their own visibility, shifting the value of publications toward social currency instead of scientific merit. Consequently, we risk prioritizing research based on flawed metrics, sidelining potentially groundbreaking work that addresses real-world needs simply because it didn’t make it into a top-tier journal.

2 Money and Time Sunk in Research

In his foundational blog post that launched ResearchHub, Brian Armstrong highlighted a critical flaw in the research funding landscape: funders often allocate resources to hypotheses and issues without a clear path to meaningful applications, underscoring a misalignment between research investments and real-world impact. This disconnect is further compounded by weak feedback mechanisms from private industries, which could otherwise inform researchers about the most pressing challenges they face and guide funding toward areas of urgent need. Without this essential insight into market incentives, research funding frequently becomes misdirected toward projects with limited potential for commercialization or meaningful real-world application. ResearchHub was founded with this awareness, striving to bridge these gaps and direct resources to research with scientific impact.

To prioritize research effectively, we need to create a system that not only streamlines funding processes but also actively solicits input from industry stakeholders about pressing challenges. By addressing the inefficiencies in funding, we can better allocate time and resources to research that truly matters.

3 Skewed Priorities of Major Funding Bodies

Traditionally, science is funded by two main stakeholders: the government and the private business sector (big pharma, etc.). These stakeholders set their funding priorities based on their overarching missions and their preference for safe returns. Since these funding bodies allocate grants based on an application's perceived potential to advance their own goals, safe science is prioritized, leaving fundamental, exploratory research underfunded.

In the United States, business investment in research and development has surpassed federal contributions, but it focuses largely on applied research. While applied research rightly benefits product development and commercialization directly, private support for early-stage research remains limited. Federal funding, meant to balance this by focusing on foundational research, has declined as a share of GDP since the mid-1990s. Budget caps for fiscal years 2024 and 2025 have forced science to compete within a smaller discretionary budget that also funds public health, education, and veterans' services. Although the public generally supports investment in science, politically influenced funding structures complicate efforts to prioritize basic, fundamental research. Federal agencies negotiate priorities based on immediate political and economic interests, often sidelining sustained, curiosity-driven research projects.

For example, the United States’ fiscal year 2024 science budget focuses on applied and technology-driven research, especially in areas such as semiconductor production under the CHIPS and Science Act and targeted innovation hubs. This focus aligns with national competitiveness goals but leaves gaps in funding for the basic science that is crucial for long-term innovation.

This creates a funding landscape where certain areas of science, especially high-risk, high-reward research, struggle to secure adequate funding.

How DeSci DAOs Help Solve the Problems with Prioritization in Research

Community-driven Funding for Specific Areas of Research

In a recent blog post about reforming scientific funding, we highlight how DeSci DAOs (Decentralized Autonomous Organizations) offer community-driven answers to the questions around prioritizing what research to fund. These DAOs are communities directing funding towards specific areas of science that are chosen and supported by people with a direct need for the research outcomes.

Unlike traditional funding models, where research prioritization may be driven by the academic prestige metrics discussed earlier or by limited funding opportunities, DeSci DAOs allow stakeholders to collectively decide which research is funded and ultimately carried out.
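The mechanics of collective prioritization can be sketched with a toy model. The following is a minimal, hypothetical illustration of a token-weighted vote splitting a funding pool across proposals in proportion to the votes each receives; the proposal names, token counts, and proportional-allocation rule are all assumptions made for the sake of example, not a description of any specific DAO.

```python
# Hypothetical sketch of how a DeSci DAO might allocate a funding pool:
# members commit voting tokens to proposals, and the pool is split among
# proposals in proportion to the tokens each receives. All names and
# numbers below are illustrative, not real DAO data.

def allocate_funding(pool: float, votes: dict[str, float]) -> dict[str, float]:
    """Split a funding pool across proposals proportionally to token votes."""
    total = sum(votes.values())
    if total == 0:
        # No votes cast: nothing is allocated.
        return {proposal: 0.0 for proposal in votes}
    return {proposal: pool * v / total for proposal, v in votes.items()}

votes = {
    "hair-loss-trial": 600.0,     # tokens committed by stakeholders
    "biomarker-assay": 300.0,
    "replication-study": 100.0,
}
allocation = allocate_funding(100_000.0, votes)
print(allocation)
# {'hair-loss-trial': 60000.0, 'biomarker-assay': 30000.0, 'replication-study': 10000.0}
```

Real DAOs layer governance rules (quorums, quadratic voting, vesting) on top of this, but the core idea is the same: the spending priority is set directly by the stakeholders' votes rather than by a central grant committee.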

Creating Research People Actually Need: How Community Funding Focuses on Real-World Impact

Prioritizing what research to spend time and money on requires a keen understanding of market needs and the potential impact of the research. This focus on finding market fit establishes the relevance of the research and increases the likelihood of its successful commercialization. By aligning research efforts with the demands of stakeholders, researchers can identify whether and how their work is most likely to translate into viable solutions.

In traditional funding models, there is often a disconnect between researchers and the end-users of their findings. DAOs are emerging as innovative vehicles for funding specific areas of research, aligning financial resources directly with stakeholder needs. Through community-led funding models, stakeholders can focus on funding research initiatives that promise to translate into finished products that benefit them. This approach creates an inherent product-market fit, as the parties investing in the research have a vested interest in its outcomes. 

A notable example is HairDAO, an open-source network where patients and researchers collaborate to develop treatments for hair loss. Since its inception, HairDAO has successfully filed two patents and launched the first DeSci consumer product, “Follicool”. Here, the market fit is evident: the community of patients and other stakeholders actively decided what to fund and helped develop the product by logging their treatment experiences.

By empowering those who will ultimately benefit from the research to allocate funds, this model reduces the likelihood of wasting resources on research with limited market demand.

ResearchHub’s Answers to the Problems of Prioritization in Research

ResearchHub is a tool for the open publication and discussion of scientific research. Some of ResearchHub’s answers include:

A Collaborative Platform for Accelerating Scientific Research

ResearchHub’s design directly addresses the structural issues in the traditional scientific economy: funders, scientists, and journals all currently operate within separate incentive systems. Funders aim to support high-impact research, scientists need to secure funding to produce knowledge, and journals seek to profit from curating and distributing research. This misalignment results in an inefficient distribution of resources and limits access to scientific knowledge.

Rooted in the belief that scientific knowledge should be openly accessible and reusable, ResearchHub’s collaborative platform allows its global community of scientists to openly share, discuss, rank, and fund scientific research.

Spotlight Impactful Research with Crowd-sourced Expertise

ResearchHub offers a community-driven approach to ranking and prioritizing scientific work by using crowd-sourced upvotes and reputation scores (REP) to filter and highlight valuable research. Unlike the slow, profit-driven process of prestigious journals, which can delay scientific communication through rounds of submissions, rejections, and resubmissions, ResearchHub accelerates knowledge sharing by enabling experts to directly highlight the most impactful work. 

This upvoting mechanism allows users to quickly surface high-impact research and filter it to the top, creating a new signaling mechanism that makes it easier for researchers and funders to discover valuable studies in real time.
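To make the signaling idea concrete, here is a deliberately simplified sketch of reputation-weighted ranking. This is not ResearchHub's actual algorithm; the paper names, REP values, and the sum-of-weights scoring rule are assumptions chosen only to show why expert-weighted votes rank differently than raw vote counts.

```python
# Illustrative sketch (not ResearchHub's actual algorithm) of reputation-
# weighted ranking: each upvote counts in proportion to the voter's REP,
# so endorsements from recognized experts carry more signal than a larger
# number of low-REP votes.

def score(upvoter_reps: list[float]) -> float:
    """Weighted-upvote score: the sum of the voters' reputation weights."""
    return sum(upvoter_reps)

papers = {
    "paper-A": [120.0, 80.0],          # two upvotes from high-REP users
    "paper-B": [5.0, 5.0, 5.0, 5.0],   # four upvotes from low-REP users
}
ranked = sorted(papers, key=lambda p: score(papers[p]), reverse=True)
print(ranked)  # ['paper-A', 'paper-B']
```

Note that paper-B has more upvotes, yet paper-A ranks first because its endorsements come from voters with far more accumulated reputation; that asymmetry is what lets expert judgment cut through raw popularity.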

Directly Crowdfund Preregistrations

In traditional funding, grant decisions are often guided by two main goals: supporting research that offers a safe return on investment and that aligns with the funders' organizational objectives. These decisions frequently rely on subjective evaluations of proposals, with significant resources devoted to administrative processes. This conventional model uses outdated metrics to gauge the potential impact of scientific work, leading to a disconnect between awarded funding and the real needs of researchers and stakeholders.

ResearchHub offers a transparent, community-driven approach for evaluating and supporting research projects. By directly crowdfunding preregistrations, an approach proven to enhance reproducibility in science (Nosek et al., 2018; Sarafoglou et al., 2022), ResearchHub prioritizes scientific potential over institutional risk, aligning funding more closely with community-supported research goals.
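The crowdfunding flow described above can be sketched as a small toy model. The all-or-nothing settlement rule, the backer names, and the amounts below are hypothetical simplifications for illustration, not ResearchHub's actual implementation.

```python
# Hypothetical all-or-nothing crowdfunding sketch for a preregistration:
# pledges accumulate against a stated goal, and funds are released only
# if the goal is met; otherwise backers are refunded. Purely illustrative.

from dataclasses import dataclass, field

@dataclass
class Preregistration:
    goal: float
    pledges: dict[str, float] = field(default_factory=dict)

    def pledge(self, backer: str, amount: float) -> None:
        """Record (or top up) a backer's pledge."""
        self.pledges[backer] = self.pledges.get(backer, 0.0) + amount

    def settle(self) -> str:
        """Release funds if the goal is met; refund backers otherwise."""
        raised = sum(self.pledges.values())
        return "funded" if raised >= self.goal else "refunded"

study = Preregistration(goal=5_000.0)
study.pledge("alice", 3_000.0)
study.pledge("bob", 2_500.0)
print(study.settle())  # funded
```

Because the preregistration states the hypotheses and methods before any money moves, backers know exactly what they are funding, and the community's pledges become a direct, transparent signal of which research questions matter to it.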

ResearchHub is a better, democratic way to publish, review, and fund research.

References

[1] Armstrong, B. (2019, February 25). Ideas on how to improve scientific research. Medium. https://barmstrong.medium.com/ideas-on-how-to-improve-scientific-research-9e2e56474132

[2] Brembs, B., Button, K., & Munafò, M. (2013). Deep impact: unintended consequences of journal rank. Frontiers in Human Neuroscience, 7, 291. https://doi.org/10.3389/fnhum.2013.00291 

[3] Landhuis, E. (2016). Scientific literature: Information overload. Nature, 535(7612), 457–458. https://doi.org/10.1038/nj7612-457a 

[4] Nosek, B. A., Ebersole, C. R., DeHaven, A. C., & Mellor, D. T. (2018). The preregistration revolution. Proceedings of the National Academy of Sciences of the United States of America, 115(11), 2600–2606. https://doi.org/10.1073/pnas.1708274114 

[5] Sarafoglou, A., Kovacs, M., Bakos, B., Wagenmakers, E.-J., & Aczel, B. (2022). A survey on how preregistration affects the research workflow: better science but more work. Royal Society Open Science, 9(7). https://doi.org/10.1098/rsos.211997