To date, we have collected around 740 million events from 12 different sources since we launched our Event Data service in 2017. Each event is an online mention of the research associated with a DOI, either via the DOI directly or using the associated URL. However, we know that there is much more out there. Because of this, we would like to explore where we could expand.
We invite proposals to conduct a gap analysis for Event Data sources, looking at what we currently collect and seeing what more could be added.
We are delighted to announce the formation of a new Advisory Group to support us in improving preprint metadata. Preprints have grown in popularity over the last few years, with increasing focus brought by the need to rapidly disseminate knowledge in the midst of a global pandemic. We have supported metadata deposits for preprints under the content type ‘posted content’ since 2016, and members currently register a total of around 17,000 new preprint metadata records each month.
It is time to put the ‘R’ back into R&D.
The Crossref R&D team was originally created to focus on the kinds of research projects that have allowed Crossref to make transformational technology changes, launch innovative new services, and engage with entirely new constituencies. Some illustrious projects that had their origins in the R&D group include:
- DOI Content Negotiation
- Similarity Check (originally CrossCheck)
- ORCID (originally Author DOIs)
- Crossmark
- The Open Funder Registry
- The Crossref REST API
- Linked Clinical Trials
- Event Data
- Grant registration
- ROR

And for each project that has graduated, there have been several that have not.
This announcement has been in the works for some time, but everything seems to take longer when there is a pandemic going on, including finding time and headspace to plan out our strategy for the next few years.
Over the last year or so, we have had our heads down addressing how to scale our 20-year-old system and operation – and adapting to new ways of working. But we’ve also spent time talking to people, forging alliances, looking ahead, and making plans.
Just when you thought 2020 couldn’t go any faster, it’s Peer Review Week again! Peer review is such an important part of the research process, and highlighting the role it plays is key to retaining and reinforcing trust in the publishing process.
“Maintaining trust in the peer review decision-making process is paramount if we are to solve the world’s most pressing problems. This includes ensuring that the peer review process is transparent (easily discoverable, accessible, and understandable by anyone writing, reviewing, or reading peer-reviewed content) and that everyone involved in the process receives the training and education needed to play their part in making it reliable and trustworthy.”
A key way that publishers can make peer reviews easily discoverable and accessible is by registering them with Crossref - creating a persistent identifier for each review, linking them to the relevant article, and providing rich metadata to show what part this item played in the evolution of the content. It also gives a way to acknowledge the incredible work done by academics in this area.
For Peer Review week last year, Rosa and Rachael from Crossref created this short video to explain more.
Fast forward to 2020, and over 75,000 peer reviews have now been registered with us by a range of members including Wiley, PeerJ, eLife, Stichting SciPost, Emerald, IOP Publishing, Publons, The Royal Society, and Copernicus. We encourage all members to register peer reviews with us - and you can keep up to date with who is registering them via this API query. (We recommend installing a JSON viewer for your browser to view the results if you haven’t done so already.)
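If you’d rather explore programmatically, the same data is available from the public Crossref REST API by filtering works on the peer-review type. Here’s a minimal sketch that builds such a query URL with the standard library (the `rows` parameter just limits the page size; no API key is needed):

```python
# Sketch: querying the Crossref REST API for registered peer reviews.
# The /works route and the type:peer-review filter are part of the
# public API; rows controls how many records come back per page.
from urllib.parse import urlencode

BASE = "https://api.crossref.org/works"

def peer_review_query(rows=5):
    """Build a query URL for works registered with type 'peer-review'."""
    params = {"filter": "type:peer-review", "rows": rows}
    return f"{BASE}?{urlencode(params)}"

print(peer_review_query())
```

Fetching that URL returns a JSON envelope whose `message["total-results"]` field shows how many peer reviews have been registered to date, alongside the first page of records.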
Register peer reviews and contribute to the Research Nexus
At Crossref, we talk a lot about the research nexus, and it’s a theme that you’re going to hear a lot more about from us in the coming months and years.
The published article no longer has the supremacy it once did, and other outputs - and inputs - have increasing importance. Linked data and protocols are key for reproducibility, peer reviews increase trust and show the evolution of knowledge, and other research objects help increase the discoverability of content. Registering these objects and stating the relationships between them support the research nexus.
Peer reviews in particular are key to demonstrating that the scholarly record is not fixed - it’s a living entity that moves and changes over time. Registering peer reviews formally integrates these objects into the scholarly record and makes sure the links between the reviews and the article both exist and persist over time. It allows analysis or research on peer reviews and highlights richer discussions than those provided by the article alone, showing how discussion and conversation help to evolve knowledge. In particular, post-publication reviews highlight how the article is no longer the endpoint - after publication, research is further validated (or not!) and new ideas emerge and build on each other. You can see a real-life example of this from F1000 in a blog post written by Jennifer Lin a few years ago.
As we’ve said before:
Article metadata + peer review metadata = a fuller picture of the evolution of knowledge
Registering peer reviews also provides publishing transparency and reviewer accountability, and enables contributors to get credit for their work. If peer review metadata includes ORCID iDs, our ORCID auto-update service means that we can automatically update the author’s ORCID record (with their permission), while a forthcoming schema update will take this even further by making CRediT contributor roles available.
How to register peer reviews with Crossref
You need to be a member of Crossref to register peer reviews with us, and at present you can only do so by sending us your XML files. Unfortunately, peer reviews aren’t yet supported by our helper tools such as the OJS plugin, Metadata Manager, or the web deposit form.
We know that there’s a range of outputs from the peer review process, and our schema allows you to identify many of them, including referee reports, decision letters, and author responses. You can include outputs from the initial submission only, or cover all subsequent rounds of revisions, giving a really clear picture of the evolution of the article. Members can even register content for discussions after the article was published, such as post-publication reviews.
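To give a flavour of what such a deposit looks like, here is a minimal sketch that assembles the peer-review portion of a metadata record using only the standard library. The element and attribute names (`peer_review`, `type`, `stage`, `revision-round`, `review_date`, `doi_data`) follow our reading of the Crossref peer review schema, but please check the current schema documentation before depositing real metadata; the DOI below is made up for illustration:

```python
# Hedged sketch of a peer-review metadata fragment for a Crossref deposit.
# Names are based on the Crossref peer review schema; the DOI is hypothetical.
import xml.etree.ElementTree as ET

review = ET.Element("peer_review", {
    "type": "referee-report",    # also e.g. decision-letter, author-comment
    "stage": "pre-publication",  # or post-publication for later discussion
    "revision-round": "1",       # which round of revision this belongs to
})

# When the review took place.
date = ET.SubElement(review, "review_date")
ET.SubElement(date, "year").text = "2020"

# A human-readable title for the review itself.
titles = ET.SubElement(review, "titles")
ET.SubElement(titles, "title").text = "Referee report for example article"

# Each review gets its own DOI, linked back to the reviewed article.
doi_data = ET.SubElement(review, "doi_data")
ET.SubElement(doi_data, "doi").text = "10.5555/example-review-1"  # hypothetical

print(ET.tostring(review, encoding="unicode"))
```

In a real deposit this fragment would sit inside the usual deposit envelope, with a relationship pointing at the DOI of the article under review so that the link between review and article persists.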
Get involved with Peer Review Week 2020
We’re looking forward to seeing the debate sparked by Peer Review Week and hearing from our members about this important area. You can get involved by checking out the Peer Review Week 2020 website or following @PeerRevWeek and the hashtags #PeerRevWk20 #trustinpeerreview on Twitter.
We’re excited to see what examples of the evolution of knowledge will be discoverable in registered and linked peer reviews this time next year!