Members can participate in Cited-by by completing the following steps:
1. Deposit references for one or more prefixes as part of your content registration process. Use your Participation Report to see your progress with depositing references. This step is not mandatory, but it is highly recommended to ensure that your citation counts are complete.
2. We match the metadata in the references to DOIs to establish Cited-by links in the database. As new content is registered, we automatically update the citations and, if you have Cited-by alerts enabled, notify you of the new links.
3. Display the links on your website. We recommend displaying the citations you retrieve on your DOI landing pages; one way to retrieve them programmatically is sketched below.
If you are a member through a Sponsor, you may have access to Cited-by through your sponsor – please contact them for more details. OJS users can use the Cited-by plugin.
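For step 3, here is a minimal sketch (Python) of retrieving Cited-by links through the XML query interface, assuming you query with your Crossref account credentials and a DOI registered under your prefix. The DOI and credentials below are placeholders, and the parsing is deliberately loose; check it against the response you actually receive before relying on the exact element structure.

```python
# Minimal sketch: retrieve Cited-by links for one of your DOIs through the
# XML query interface. Replace the credentials and DOI with your own; only
# members participating in Cited-by can query their registered content.
import requests
import xml.etree.ElementTree as ET

CITED_BY_ENDPOINT = "https://doi.crossref.org/servlet/getForwardLinks"


def get_cited_by(doi, username, password):
    """Return (citation_count, citing_dois) for a registered DOI."""
    response = requests.get(
        CITED_BY_ENDPOINT,
        params={"usr": username, "pwd": password, "doi": doi},
        timeout=30,
    )
    response.raise_for_status()
    root = ET.fromstring(response.content)

    # Each forward_link element describes one citing work; namespaces are
    # ignored and local tag names matched to keep the sketch short.
    forward_links = [
        el for el in root.iter() if el.tag.split("}")[-1] == "forward_link"
    ]
    citing_dois = [
        child.text
        for link in forward_links
        for child in link.iter()
        if child.tag.split("}")[-1] == "doi" and child.text
    ]
    return len(forward_links), citing_dois


if __name__ == "__main__":
    # 10.5555/12345678 is a placeholder; use a DOI registered under your prefix.
    print(get_cited_by("10.5555/12345678", "USERNAME", "PASSWORD"))
```

The citation count is simply the number of citing works returned; how you format and display them on your landing pages is up to you.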
Citation matching
Members sometimes submit references without including a DOI tag for the cited work. When this happens, we look for a match based on the metadata provided. If we find one, the reference metadata is updated with the DOI and we add the "doi-asserted-by": "crossref" tag. If we don’t find a match immediately, we will try again at a later date.
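The outcome of this matching is visible in the public REST API: each reference on a work record carries a "doi-asserted-by" value of "publisher" (the member supplied the DOI) or "crossref" (we added it through matching). The short sketch below tallies those values for a single work; the DOI shown is a placeholder.

```python
# Sketch: inspect the references of a registered work via the public REST API
# and count which DOIs were supplied by the member ("publisher"), which were
# added by Crossref matching ("crossref"), and which have no DOI (yet).
import requests


def reference_matching_summary(doi):
    work = requests.get(
        f"https://api.crossref.org/works/{doi}", timeout=30
    ).json()["message"]
    summary = {"publisher": 0, "crossref": 0, "unmatched": 0}
    for ref in work.get("reference", []):
        if "DOI" not in ref:
            summary["unmatched"] += 1
        else:
            summary[ref.get("doi-asserted-by", "publisher")] += 1
    return summary


# Placeholder DOI; substitute one of your own registered works.
print(reference_matching_summary("10.5555/12345678"))
```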
There are some references for which we won’t find matches, for example where the DOI has been registered with an agency other than Crossref (such as DataCite), or where the reference points to an object without a DOI, such as conferences, manuals, blog posts, and articles from some journals.
To perform matching, we first check if a DOI tag is included in the reference metadata. If so, we assume it is correct and link the corresponding work. If there isn’t a DOI tag, we perform a search using the metadata supplied and select candidate results by thresholding. The best match is found through a further validation process. Learn more about how we match references. The same process is used for the results shown on our Simple Text Query tool.
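For illustration only, the sketch below mimics this search-then-validate idea using the REST API’s bibliographic search. It is not our production matcher: the score threshold, the year check, and the sample reference string are invented for the example.

```python
# Conceptual sketch of a "search, threshold, validate" matcher. The threshold
# and validation rule are invented for illustration and are NOT the rules used
# by Crossref's actual reference matching.
import requests

SCORE_THRESHOLD = 60.0  # hypothetical relevance cut-off


def match_reference(unstructured_ref, year=None):
    """Return the DOI of the best surviving candidate, or None."""
    items = requests.get(
        "https://api.crossref.org/works",
        params={"query.bibliographic": unstructured_ref, "rows": 5},
        timeout=30,
    ).json()["message"]["items"]

    # Candidate selection: keep only results above the relevance threshold.
    candidates = [i for i in items if i.get("score", 0) >= SCORE_THRESHOLD]

    # Validation: a crude check that the candidate's publication year agrees
    # with the reference, when a year is available.
    for item in candidates:
        issued = item.get("issued", {}).get("date-parts", [[None]])[0][0]
        if year is None or str(issued) == str(year):
            return item.get("DOI")
    return None


# Invented reference string, used only to exercise the sketch.
print(match_reference(
    "Smith J. An example article. Journal of Examples. 2019;12(3):45-67.", "2019"
))
```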
All citations to a work are returned in the corresponding Cited-by query.
Page owner: Isaac Farley | Last updated 2023-April-28