Use doc-to-doc comparison to compare a primary uploaded document with up to five comparison uploaded documents. Any documents that you upload to doc-to-doc comparison will not be indexed and will not be searchable against any future submissions.
Uploading a primary document to doc-to-doc comparison will cost you a single document submission, but the comparison documents uploaded will not cost you any submissions.
How to use doc-to-doc comparison
Start from Folders, go to the Submit a document menu, and click Doc-to-Doc Comparison.
The doc-to-doc comparison screen allows you to choose one primary document and up to five comparison documents. Choose the destination folder for the documents you will upload. The Similarity Report for the comparison will be added to the same folder.
For your primary document, provide the author’s first name, last name, and document title. If you do not provide these details, the filename will be used for the title, and the author details will stay blank.
If you have administrator permissions, you can assign the Similarity Report for the comparison to a reporting group by selecting one from the Reporting Group drop-down. Learn more about reporting groups.
Click Choose File, and select the file you want to upload as your primary document. See the file requirements for both the primary and comparison documents on the right of the screen.
You can choose up to five comparison documents to check against your primary document. These do not need titles or author details, but each filename must be unique. Click Choose Files, and select the files you would like to upload as comparison documents. To remove a file from the comparison before uploading, click the X icon next to it. To upload your files for comparison, click Upload.
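The upload rules above (at most five comparison documents, every filename unique) are enforced by the product itself; as a rough illustration only, here is a hypothetical local pre-check in Python that applies the same rules to a proposed batch of filenames. The function name and messages are invented for this sketch and are not part of the product.

```python
# Illustrative sketch only: the real checks happen in the upload screen.
# This mimics the stated rules: no more than five comparison documents,
# and each filename must be unique.

def validate_comparison_files(filenames, max_files=5):
    """Return a list of problems with a proposed comparison upload."""
    problems = []
    if not filenames:
        problems.append("no comparison documents selected")
    if len(filenames) > max_files:
        problems.append(f"too many files: {len(filenames)} > {max_files}")
    seen = set()
    for name in filenames:
        if name in seen:
            problems.append(f"duplicate filename: {name}")
        seen.add(name)
    return problems

print(validate_comparison_files(["a.docx", "b.docx", "a.docx"]))
# → ['duplicate filename: a.docx']
```

An empty result means the batch satisfies both rules and would be accepted for upload.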
Once your document has been uploaded and compared against the comparison documents, it will appear in your chosen destination folder.
This upload will have ‘Doc-to-Doc Comparison’ beneath the document title to show that this is a comparison upload and has not been indexed.
The upload will be given a Similarity Score against the selected comparison documents, which is also displayed in the report column. Click the similarity percentage to open the doc-to-doc comparison in the Document Viewer.
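The Similarity Score is a percentage of the primary document's text that matches the comparison documents. The product's actual matching algorithm is proprietary and not documented here; purely as an illustration of the idea, a percentage like this can be sketched with shared word trigrams (three-word sequences) between two documents:

```python
# Illustrative only: this is NOT the product's matching algorithm, just a
# rough sketch of how a "percentage of matching text" score can work,
# using shared word trigrams between a primary and a comparison document.

def trigrams(text):
    """All three-word sequences in the text, case-insensitive."""
    words = text.lower().split()
    return {tuple(words[i:i + 3]) for i in range(len(words) - 2)}

def similarity_percent(primary, comparison):
    """Percentage of the primary document's trigrams found in the comparison."""
    p = trigrams(primary)
    if not p:
        return 0
    return round(100 * len(p & trigrams(comparison)) / len(p))

print(similarity_percent("the quick brown fox jumps",
                         "the quick brown fox sleeps"))  # → 67
```

Note the score is relative to the primary document: an identical comparison document scores 100, and one sharing no three-word run scores 0, which mirrors how each comparison document in the report gets its own percentage.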
The Document Viewer is separated into three sections:
Along the top of the screen, the paper information bar shows details about the primary document, including document title, author, date the report was processed, word count, number of comparison documents provided, and how many of those documents matched with the primary document.
The left panel shows the paper text - the text of your primary document, with matching text highlighted in red.
Your comparison documents will appear in the sources panel to the right, showing instances of matching text within the submitted documents.
By default, the doc-to-doc comparison will open the Document Viewer in the All Sources view. This view lists all the comparison documents you uploaded, each with a percentage showing how much of its content matches the primary document. If a comparison document has no text in common with the primary document, it shows 0%.
Doc-to-doc comparison can also be viewed in Match Overview mode. In this view, the comparison documents are listed with highest match percentage first, and all the sources are shown together, color-coded, on the paper text.
Page owner: Kathleen Luschek | Last updated 2020-May-19