To work out which version you're on, look at the website address you use to access iThenticate. If you go to ithenticate.com, you are using v1. If you use a bespoke URL, such as https://crossref-[your member ID].turnitin.com/, you are using v2.
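If you manage access for several accounts, a quick check of the URL pattern can save time. The sketch below is illustrative only (the member ID shown is a placeholder), not an official Crossref or Turnitin tool:

```python
# Illustrative sketch: infer which iThenticate version an account uses
# from the URL patterns described above. Not an official tool.
def ithenticate_version(url: str) -> str:
    if "ithenticate.com" in url:
        return "v1"
    if ".turnitin.com" in url:  # bespoke URLs look like crossref-<member ID>.turnitin.com
        return "v2"
    return "unknown"

print(ithenticate_version("https://ithenticate.com"))              # v1
print(ithenticate_version("https://crossref-1234.turnitin.com/"))  # v2 (1234 is a placeholder member ID)
```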
Use doc-to-doc comparison to compare an uploaded primary document against up to five uploaded comparison documents. Any documents that you upload to doc-to-doc comparison will not be indexed and will not be searchable against any future submissions.
Uploading a primary document to doc-to-doc comparison will use one document submission, but the comparison documents you upload will not use any submissions.
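As a toy illustration of this accounting (the allowance figure below is invented, not a real quota):

```python
# Toy sketch of the submission accounting described above: each doc-to-doc
# comparison uses one submission for the primary document, no matter how
# many comparison documents (up to five) are attached. The starting
# allowance is an invented example figure.
allowance = 100         # hypothetical document submissions available
comparisons_run = 3     # each run uploads one primary document
allowance -= comparisons_run * 1  # comparison documents cost nothing
print(f"Submissions remaining: {allowance}")  # Submissions remaining: 97
```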
Start from Folders, go to the Submit a document menu, and click Doc-to-Doc Comparison.
The doc-to-doc comparison screen allows you to choose one primary document and up to five comparison documents. Choose the destination folder for the documents you will upload. The Similarity Report for the comparison will be added to the same folder.
For your primary document, provide the author's first name, last name, and document title. If you do not provide these details, the filename will be used as the title, and the author details will be left blank.
If you have administrator permissions, you can assign the Similarity Report for the comparison to a reporting group by selecting one from the Reporting Group drop-down. Learn more about reporting groups.
Click Choose File, and select the file you want to upload as your primary document. See the file requirements for both the primary and comparison documents on the right of the screen.
You can choose up to five comparison documents to check against your primary document. These do not need titles or author details, but each filename must be unique. Click Choose Files, and select the files you would like to upload as comparison documents. To remove a file from the comparison before you upload it, click the X icon next to it. To upload your files for comparison, click Upload.
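If you want to check your files before submitting, a pre-flight validation along these lines can catch problems early. This is a sketch of the constraints described above (one primary document, up to five comparison documents, unique filenames), not part of iThenticate itself:

```python
# Illustrative sketch: pre-flight checks mirroring the doc-to-doc
# comparison rules described above. Not part of iThenticate.
def validate_comparison_upload(primary: str, comparisons: list[str]) -> None:
    if not primary:
        raise ValueError("A primary document is required.")
    if not 1 <= len(comparisons) <= 5:
        raise ValueError("Provide between one and five comparison documents.")
    if len(set(comparisons)) != len(comparisons):
        raise ValueError("Each comparison filename must be unique.")

# Example with placeholder filenames:
validate_comparison_upload("manuscript.docx", ["draft_v1.docx", "draft_v2.docx"])
```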
Once your document has been uploaded and compared against the comparison documents, it will appear in your chosen destination folder.
This upload will have ‘Doc-to-Doc Comparison’ beneath the document title to show that this is a comparison upload and has not been indexed.
The upload will be given a Similarity Score against the selected comparison documents, displayed in the report column. Click the similarity percentage to open the doc-to-doc comparison in the Document Viewer.
The Document Viewer is separated into three sections:
Along the top of the screen, the paper information bar shows details about the primary document, including the document title, author, date the report was processed, word count, number of comparison documents provided, and how many of those documents matched the primary document.
The left panel shows the paper text - the text of your primary document - with matching text highlighted in red.
Your comparison documents appear in the sources panel on the right, which shows the instances of matching text within the submitted documents.
By default, the doc-to-doc comparison will open the Document Viewer in the All Sources view. This view lists all the comparison documents you uploaded, each with a percentage showing how much of its content is similar to the primary document. If a comparison document shares no matching text with the primary document, it shows 0%.
Doc-to-doc comparison can also be viewed in Match Overview mode. In this view, the comparison documents are listed with the highest match percentage first, and all the sources are shown together, color-coded, on the paper text.
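As a toy illustration of the Match Overview ordering (the filenames and percentages below are made-up sample data; this is not Turnitin's implementation):

```python
# Toy illustration of the Match Overview ordering described above:
# comparison documents sorted by match percentage, highest first.
# The filenames and percentages are invented sample data.
sources = [
    ("draft_v1.docx", 12),
    ("draft_v2.docx", 47),
    ("notes.docx", 0),
]
for name, pct in sorted(sources, key=lambda s: s[1], reverse=True):
    print(f"{pct:3d}%  {name}")
```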