Typically, when an editorially significant update is made to a document, the publisher will not modify the original document, but will instead issue a separate document (such as a correction or retraction notice) which explains the change. This separate document will have a different DOI from the document that it corrects and will therefore have different metadata. This process is complementary to versioning.
In this example, article A (with the DOI 10.5555/12345678) is eventually retracted by a retraction notice RN (with the DOI 10.5555/24242424x). Each document has Crossmark metadata, but the fact that RN updates article A is recorded only in RN's Crossmark deposit. The Crossmark internal API then ties the two documents together and indicates in the metadata of the original document (A) that it has been updated_by the second document (RN).
The Crossmark part of the metadata schema is used to register updates, but this doesn't mean that you need to have implemented other parts of Crossmark to deposit updates. In the examples below, if you don't usually deposit other Crossmark metadata, you can include only the <update> element within the <crossmark> section of the deposit XML, as in the minimal sketch that follows.
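For instance, a minimal <crossmark> section that does nothing but register an update might look like the sketch below. The DOI, date, and update type are placeholders; check the current deposit schema for exact element order and any other elements your deposit needs.

```xml
<crossmark>
  <updates>
    <!-- the type and date attributes describe the update; the element content
         is the DOI of the item being updated (placeholder values shown) -->
    <update type="correction" date="2024-01-15">10.5555/placeholder-doi</update>
  </updates>
</crossmark>
```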
Example 1: simple retraction
This is a simple example of article A being retracted by a retraction notice RN where both A and RN have Crossmark metadata deposited.
First, the PDF is produced and the XML deposited to Crossref.
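A sketch of the relevant part of the retraction notice's deposit, using the DOIs from the description above, might look like the following. The surrounding journal metadata is abridged and the policy DOI is a placeholder; consult the current deposit schema before depositing.

```xml
<journal_article publication_type="full_text">
  <titles>
    <title>Retraction: Article A</title>
  </titles>
  <publication_date media_type="online">
    <month>03</month>
    <day>04</day>
    <year>2024</year>
  </publication_date>
  <crossmark>
    <!-- placeholder DOI of the member's Crossmark policy page -->
    <crossmark_policy>10.5555/crossmark_policy</crossmark_policy>
    <updates>
      <!-- RN declares that it retracts article A -->
      <update type="retraction" date="2024-03-04">10.5555/12345678</update>
    </updates>
  </crossmark>
  <doi_data>
    <doi>10.5555/24242424x</doi>
    <resource>https://www.example.org/retraction-notice</resource>
  </doi_data>
</journal_article>
```

Note that article A's own deposit does not need to mention RN; the updated_by relationship on A is derived by Crossref from RN's deposit.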
Example 2: simple correction

This is a simple example of article B being corrected by a correction notice CN where both B and CN have Crossmark metadata deposited. The only real difference from the previous example is that we are registering a different type of update, as shown in the sketch below.
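The Crossmark fragment of the correction notice CN's deposit could look like this; both DOIs and the date are placeholders, since the description above does not name them.

```xml
<crossmark>
  <updates>
    <!-- CN declares that it corrects article B (placeholder DOI and date) -->
    <update type="correction" date="2024-05-10">10.5555/87654321</update>
  </updates>
</crossmark>
```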
Example 3: in-situ update

When a member does not issue a separate update/correction/retraction notice and instead simply changes the document (without changing its DOI), this is called an in-situ update. In-situ updates or corrections are not recommended because they tend to obscure the scholarly record: how do you tell what the differences are between the version you downloaded and the update? How do you differentiate them when citing them (remember, we are only talking about significant updates here)? However, some members need to make in-situ updates, and they can be recorded as sketched below.
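One way to record an in-situ update, assuming the <update> element may point at the document's own DOI, is to redeposit the document's Crossmark metadata with an update entry that references itself. The DOI and date below are placeholders.

```xml
<!-- Crossmark fragment redeposited against the document's own DOI (placeholder) -->
<crossmark>
  <updates>
    <!-- the update points back at the same DOI, recording that the content
         changed in place rather than via a separate notice -->
    <update type="correction" date="2024-06-01">10.5555/55555555</update>
  </updates>
</crossmark>
```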
Example 4: correction of article that has no Crossmark metadata deposited
If you deposit Crossmark metadata for a retraction or an update notice which, in turn, points at an article that does not have Crossmark metadata assigned to it, we will generate a "stub" Crossmark record for the item being updated. The stub simply copies essential Crossmark metadata from the updating notice. This metadata can be queried via our API, but won't activate anything on your site unless you add the Crossmark widget to the corresponding page of the item being updated.
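From the depositor's side this looks the same as Example 2: only the notice's deposit carries Crossmark metadata. A sketch, with a placeholder DOI standing in for the article that has no Crossmark metadata of its own:

```xml
<crossmark>
  <updates>
    <!-- the target DOI has no Crossmark metadata of its own;
         Crossref generates a stub record for it -->
    <update type="correction" date="2024-07-15">10.5555/no-crossmark-doi</update>
  </updates>
</crossmark>
```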
Example 5: correction notice that corrects multiple documents
Sometimes members issue correction or clarification notices which provide corrections for multiple documents. This too can be supported by Crossmark. In the following example, one correction/clarification document provides updates to two documents (F and G).
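In that case the notice's <updates> section simply lists one <update> element per corrected document, for example (placeholder DOIs and date for F and G):

```xml
<crossmark>
  <updates>
    <!-- one update entry per document corrected by this single notice -->
    <update type="correction" date="2024-09-01">10.5555/document-f</update>
    <update type="correction" date="2024-09-01">10.5555/document-g</update>
  </updates>
</crossmark>
```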