Assessment:Metadata
DLF Assessment Interest Group: Metadata Working Group
(Metadata Assessment Working Group)
Broad Aims + Description of the Metadata Assessment Group within DLF AIG
This group is open to anyone who is interested in taking part. There are no membership requirements or minimum levels of participation, though we hope you’ll take as active a part as you’re able and willing.
The Metadata Assessment Working Group aims to build guidelines, best practices, tools and workflows around the evaluation and assessment of metadata used by and for digital libraries and repositories. The group hopes to offer deliverables and recommendations that digital library and repository users can implement for metadata assessment and metadata quality control.
This group should not focus on prescriptive recommendations for metadata (using one schema versus another, comparisons of controlled vocabularies, etc.), but rather on how (both functionally and against which metrics) to measure, evaluate, and assess metadata as it exists in a variety of digital library systems. Future work could then use that assessment work to decide a pathway for metadata enhancement. This framing is very much influenced by the flurry of interest, activity, and discussion around metadata assessment that occurred just after DPLAfest 2015.
Metadata WG Scope & Aims in 2016
Beyond the preliminary foci given above and by the DLF AIG, the goals, aims, and deliverables of the Metadata Working Group will be decided by participants at the kick-off meeting. Some possible areas of work include (but are not limited to):
- building a framework around the moving target of defining metadata ‘quality’, and how to support flexibility there in tandem with metadata assessment recommendations
- definitions & metrics for metadata assessment (some possibilities are listed below; please expand/define/redefine as you see fit):
- metadata quality as it appears in different contexts (local repository vs harvested index)
- record completeness (how much descriptive metadata is enough? a rough sketch of one possible completeness score follows this list)
- metadata consistency (internal, within a collection, vs. external, i.e. conformance to common authority sources)
- metadata interoperability, a function of quality, completeness, and consistency: metadata generated for one specific system/use case needs to be usable in other systems/use cases
- how to refine the above for:
- value for direct human usage
- value for indirect software usage
- use cases for metadata assessment functionalities
- tools that exist for metadata assessment - including how to use and understand the results
- agile or responsive workflows that involve metadata assessment
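To make the “record completeness” idea above a bit more concrete, here is a rough, illustrative sketch (in Python) of one way a completeness score could be computed for Dublin Core-style records. The field list, sample records, and scoring choices are hypothetical placeholders, not group recommendations:

# Illustrative only: score records by the share of a chosen field set they populate.
# The field list and sample records below are hypothetical, not a recommendation.

REQUIRED_FIELDS = ["title", "creator", "date", "description", "subject", "rights"]

def completeness(record, fields=REQUIRED_FIELDS):
    """Return the fraction of the chosen fields with a non-empty value."""
    filled = sum(1 for f in fields if record.get(f) not in (None, "", []))
    return filled / len(fields)

records = [
    {"title": "Campus photographs, 1925", "creator": "Unknown", "date": "1925",
     "subject": ["Universities and colleges"], "rights": "Public domain"},
    {"title": "Oral history interview", "date": "2001-05-04"},
]

for i, rec in enumerate(records, start=1):
    missing = [f for f in REQUIRED_FIELDS if rec.get(f) in (None, "", [])]
    print(f"record {i}: {completeness(rec):.0%} complete; missing: {', '.join(missing) or 'none'}")

A real assessment workflow would of course need to decide which fields “count”, whether some fields should weigh more heavily than others, and how scores roll up from records to collections - exactly the kinds of questions the metrics work above is meant to address.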
We hope to have a number of metadata assessment deliverables to offer the community and the larger AIG group at the end of 2016 - in time for the 2016 DLF Forum. These deliverables will be determined by, and grow out of, the Metadata Assessment Working Group's interests, in particular the goals and aims set at the kick-off meeting.
Ongoing Metadata Assessment Working Group Communication
The Metadata Working Group will primarily communicate through the DLF AIG Metadata Working Group Google Group. There will also be regularly scheduled virtual meetings, the frequency and timing of which will be decided at the kick-off meeting on January 29, 2016.
We will also have a gathering at the 2016 DLF Forum (attendance at this in-person meeting is not required to be involved in this group).
All the work will be done in an open documentation space - probably Google Docs - which we invite anyone to comment on or edit. The links will be shared here and on the Google Group as work begins.
This wiki page will be used to capture definitive information - meeting times, decided deliverables and goals, etc.
General Group Timeframe
- December 2015: Call for all people interested in participating in this group.
- December 2015 - Mid-January 2016: All interested participants should join the Google Group, where they will be invited to edit shared documentation (Google documents, like this one) bringing together existing metadata assessment tools, work, resources, use cases, methods, etc.
- 29 January 2016, 1 PM EST: The first virtual meeting of the group, held via Webex (connection information under Meetings below). At this first meeting, more specific goals will be decided.
- February 2016 - November 2016 (DLF 2016): The group will hold regular virtual meetings (the frequency of which will be decided at the first call) to work towards the first round of metadata assessment goals and deliverables.
- November 2016: Meeting at DLF 2016 in Milwaukee (attendance not required to be a part of this group!), as well as the next round of planning for metadata assessment meetings and deliverables.
Meetings
2016-01-29 - Kick Off Meeting
Friday, January 29, 2016 1:00 pm | Eastern Standard Time (New York, GMT-05:00) | 1 hr
Join via Web: https://cornell.webex.com/cornell/j.php?MTID=m40e1e156337960930736db62afbf4091
Webex meeting number: 649 329 292
Meeting password: Metadata1
Join by phone:
1-855-244-8681 (call-in toll-free number, US/Canada)
1-650-479-3207 (call-in toll number, US/Canada)
Access code: 649 329 292
Agenda - built from this document: http://bit.ly/metadataAssessment
- Review of preliminary/broad aims of this group within the context of DLF Assessment Interest Group
- What is the DLF Assessment Interest Group?
- Who can join/when/for what amount of work/etc.
- Broad timeframe
- Introductions
- Name
- Briefly, your interest in this group and any deliverables you hope for
- Scope & Deliverables of this Group for 2016
- Definitions of terms used
- Based on discussion in the Google doc (http://bit.ly/metadataAssessment) and the introductions
- Decide on the top 2 to tackle and what deliverables need to come from them
- Timeframe
- Logistics
- Working space going forward
- Communication
- Regular meetings - decide on frequency
- Questions, outstanding items