Published on July 10th, 2012 | by Adrian Stevenson
Criteria Matrix for Identifying and Prioritising Discovery Datasets
(Image © IWM (Q 28286))
There’s an accompanying table for Adrian’s Timeline and Workplan post, which I’ve attached here.
| Criteria | Required | Desirable | Not required | Rationale |
| --- | --- | --- | --- | --- |
| Has the institution adopted an explicit open licensing statement plus clear and reasonable terms and conditions? | X | | | For the purposes of the … However, if it is … |
| Have they adopted common metadata standards? | X | | | In addition to adopting common standards, the quality and comprehensiveness of the metadata records must be assessed. A baseline for quality assessment will also be developed. |
| Are they deploying persistent identifiers? | | X | | This is highly desirable as a means to promote awareness and best practice, but we recognise that it may not be feasible to require this. Quality metadata sets with persistent IDs should be prioritised. |
| Are they …? | | X | | Highly desirable, and … |
| Do they provide an API? | X | | | Fundamental. If there is no API, the exemplar cannot demonstrate the Discovery ecosystem ‘at work’. |
| Are the APIs well documented? | | X | | Highly desirable, but a Mimas developer could likely invest the effort required if the documentation is not there. |
| Do they adopt widely …? | | X | | Can be assessed on a case-by-case basis. |
| Is the API regularly …? | | | X | Not necessary for this project, but it must be promoted as a fundamental part of any sustainability plan. If this exemplar were to become a service, this would be critical. |
| Are service …? | | | X | See above. |
| Are they using their …? | | | X | A hallmark of best practice in sustainability, but see above. |
| Is data being …? | | | X | Not necessary for this project, but we might want to consider asking participating institutions to track API usage on their end. |
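To show how a matrix like this might be applied in practice, here is a minimal sketch in Python. The criterion names, the candidate datasets, and the "rank by number of desirable criteria met" rule are all my own illustrative assumptions, not part of the Discovery criteria themselves: datasets failing any Required criterion are excluded outright, and the remainder are ranked by how many Desirable criteria they satisfy.

```python
# Illustrative sketch only: criterion keys and the ranking rule are
# assumptions for demonstration, not the project's actual method.

REQUIRED = ["open_licence", "common_standards", "has_api"]
DESIRABLE = ["persistent_ids", "api_documented"]

def prioritise(candidates):
    """Keep datasets meeting every Required criterion, then rank them
    by how many Desirable criteria they also meet."""
    eligible = [c for c in candidates if all(c.get(r) for r in REQUIRED)]
    return sorted(eligible,
                  key=lambda c: sum(bool(c.get(d)) for d in DESIRABLE),
                  reverse=True)

# Hypothetical candidates, named for illustration only.
candidates = [
    {"name": "Archive A", "open_licence": True, "common_standards": True,
     "has_api": True, "persistent_ids": True, "api_documented": True},
    {"name": "Archive B", "open_licence": True, "common_standards": True,
     "has_api": False},  # no API: excluded outright ("Fundamental")
    {"name": "Archive C", "open_licence": True, "common_standards": True,
     "has_api": True},   # eligible, but meets no Desirable criteria
]
print([c["name"] for c in prioritise(candidates)])
# → ['Archive A', 'Archive C']
```

The split between hard filters (Required) and tie-breakers (Desirable) mirrors the rationale column: a missing API rules a dataset out entirely, whereas missing documentation only lowers its priority.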