Objectives Curiosity and altruism are characteristics that we endeavour to nurture within the scientific community. Unfortunately, competition for funding and employment stifles such qualities and incentivises poor scientific practice. In an effort to make decisions about funding and employment more evidence-based, several quantitative metrics intended to reflect the proficiency of a scientist have emerged. Although such metrics were intended to be ‘objective,’ it is common to manipulate them to one’s own advantage: studies can be published in a piecemeal fashion to increase the number of publications, and scientists can cite their own work even when citing other literature would be a suitable alternative. This creates a corrupt cycle that rewards those who use poor scientific practices to inflate these metrics and subsequently acquire more funding, at the expense of the honest scientist.
Method Given the widespread use of quantitative metrics as a means of assessing performance and allocating funding, one possible solution to combat poor scientific practice may be the introduction of a quantitative metric that measures a scientist’s commitment to transparency and open access (the author suggests this be named the ‘Altman index,’ in remembrance of Professor Doug Altman and his commitment to scientific integrity). The amount of information available to financial and socio-political stakeholders about a scientist is overwhelming, and decision-making processes regarding funding are undoubtedly influenced by cognitive biases. In addition to open-access journals, there now exists a plethora of open-access tools such as Plaudit,1 Protocols.io,2 and the recently announced Reproducible Document Stack.3
Results To alleviate pressures on human resources dedicated to assessing a scientist’s proficiency, the Altman index would centralise and integrate data produced by end-users of open-access tools into a comprehensive and interpretable metric. This would make open access more tangible for all stakeholders and allow them to redirect their focus to the humanistic aspects of research. Additionally, making the Altman index publicly accessible would empower those members of the community who want to hold scientists and academic institutions socially accountable.
Moreover, the lack of incentive to publish negative results means that the extant scientific literature is not an accurate representation of reality, and authors may even fabricate or manipulate their results to make publication more likely. The Altman index could also be used to address this bias by rewarding the publication of negative results and the public sharing of datasets and methodology.
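To make the idea of centralising and integrating open-science signals concrete, the following is a minimal sketch of how such an index might combine per-scientist indicators into one interpretable score. The signal names, the weights, and the weighted-average formula are all illustrative assumptions of this sketch; the abstract does not specify how the Altman index would be computed.

```python
# Illustrative sketch only: signal names and weights are assumptions,
# not part of the Altman index proposal itself.

def altman_index(signals, weights=None):
    """Combine per-scientist open-science signals (each in [0, 1])
    into a single weighted score in [0, 1]."""
    default_weights = {
        "open_access_share": 0.3,   # fraction of publications that are open access
        "protocols_shared": 0.2,    # fraction of studies with a public protocol
        "data_shared": 0.3,         # fraction of studies with a public dataset
        "negative_results": 0.2,    # credit for publishing negative results
    }
    weights = weights or default_weights
    total = sum(weights.values())
    # Missing signals default to 0, so incomplete records lower the score.
    return sum(w * signals.get(name, 0.0) for name, w in weights.items()) / total

example = {
    "open_access_share": 0.8,
    "protocols_shared": 0.5,
    "data_shared": 0.5,
    "negative_results": 0.4,
}
score = altman_index(example)  # 0.24 + 0.10 + 0.15 + 0.08 = 0.57
```

Normalising by the total weight keeps the score interpretable even if stakeholders re-weight the signals, which matters given the abstract’s warning that metric design must minimise the risk of perverse incentivisation.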
Conclusions As algorithmic approaches to big data are the current zeitgeist in scientific research, it is important to note that the Altman index is only a starting point in addressing the redistribution of funds to those who have shown commitment to scientific integrity and transparency. Additionally, there is scepticism amongst many scientists and the public about such approaches, so transparency and open-mindedness are crucial in developing the Altman index. As has happened with other metrics, there may be unanticipated adverse consequences with the introduction of the Altman index, and it is important to invest time designing the Altman index in a way that minimises the risk of perverse incentivisation.