NDSA:July 20, 2015 Standards and Practices Working Group Notes

From DLF Wiki

Announcements and Project Updates

  • An iPres poster discussing the group's work has been accepted. Layout and details will be worked out in the coming months.
  • NDSA will have a reception at iPres, which is also where the Innovation Awards will be presented.
  • Video Deep Dive project: a blog post is forthcoming once Erin circulates the draft to the working group for review.
  • Email Archiving Symposium: A small working group is updating a chart that originated at Harvard, which visually displays the capabilities of different tools that work with email and how/where they are actionable during the life cycle. They are updating it to add the digital curation life cycle as well as background information on the tools: how complicated they are, what operating system is required, setup requirements, and so on. The new version would then be added to a resource such as COPTR. The goal is to have this out before SAA. If you want to be added to the email list for the email group, contact Kate.

Update on New NDSA Hosting

  • There were 5 letters of interest from 5 institutions. The Coordinating Committee reviewed these this morning and plans to interview the interested institutions over the next two weeks and make a recommendation by mid-August. The Coordinating Committee is working out how working groups can contribute to this process.


Today's Topic of Discussion: Upcoming Conference Presentations / What Are You Working On

  • Kate Murray: Presenting at IASA (http://www.2015.iasa-web.org/) with others [? Chris Lacinak and Carl Fleischhauer] on Performance of Analog-to-Digital Converters for Sound: Revisiting the Performance Methods and Metrics (FADGI Project). This project builds on work dating back to 2011, when the project began looking at the metrics and saw a possible need to modify the metrics/benchmarks. Tests have been run with various equipment against the metrics/benchmarks, and the results are being used to make recommendations in the final reports. Other documentation has come out of the project from comparing a $25,000 machine and a $600 machine. Ask Kate for more details on the standards and original reports. Two of the project's reports can be found here: http://www.digitizationguidelines.gov/guidelines/digitize-audioperf.html
  • Andrea Goethals: Will be presenting with Joey Heinen at iPres on Developing a Framework for File Format Migrations, based on a paper Joey wrote as the NDSA resident at Harvard. It was generally known that formats needed to be migrated, but instead of focusing on migrating one specific format to another, Joey developed the Migration Framework, which consists of 5 general steps (each with its own tasks and deliverables) to work through when migrating any format to another. A case study was done using Kodak Photo CDs as well as RealAudio files. Documentation of the Kodak Photo CD process is included in the paper. Some of the generic steps include:
    • Format analysis: what issues might need to be addressed for each format involved to ensure the content remains the same after migration (e.g., unique color spaces or compression).
    • Content grouping: determine how many different groups are necessary to get the best results for each format type. Each group may go through a different process.
    • Migration environment: determine what tools are needed for migration, what environment is needed, etc.
    • Stakeholder chart: document who plays what role.
  • Mariella Soprano: Has been working to form a Digital Preservation Strategy Group to support work with DPN. The Archives and Special Collections hold many materials in audio or video formats with only a single copy, as well as materials on hard drives that have not yet been transferred to a server. These materials will be key candidates for DPN, but there is a lot of work to be done prior to ingest. Islandora has been adopted as the management system.
  • Amy Kirchhoff: Portico's focus has been assisting libraries with collection management. They are interested in how to present data back to institutions, and have been working on the back end to better analyze data, as well as improving the audit interface to make institutions' collection management efforts easier.
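
To make the Migration Framework's generic steps concrete, the plan for a single migration could be captured in a checklist-style data structure like the minimal Python sketch below. This is purely illustrative: the class, field names, target format, and tool choice are assumptions, not taken from Joey Heinen's paper.

```python
from dataclasses import dataclass, field

# Illustrative sketch only: names, fields, and example values are
# assumptions, not drawn from the Migration Framework paper.
@dataclass
class MigrationPlan:
    source_format: str
    target_format: str
    format_issues: list = field(default_factory=list)    # format analysis
    content_groups: dict = field(default_factory=dict)   # content grouping
    tools: list = field(default_factory=list)            # migration environment
    stakeholders: dict = field(default_factory=dict)     # stakeholder chart

# Hypothetical plan for the Kodak Photo CD case study;
# the TIFF target and tool name are invented for illustration.
plan = MigrationPlan("Kodak Photo CD", "TIFF")
plan.format_issues.append("proprietary PhotoYCC color space")
plan.content_groups["master images"] = "highest-resolution image pack layer"
plan.tools.append("pcdtojpeg (hypothetical choice)")
plan.stakeholders["curator"] = "approves migrated output"
```

Filling in each field walks through the framework's steps in order, and comparing plans across content groups makes it visible when two groups genuinely need different processes.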