NDSA:December 15, 2014 Standards and Practices Working Group Notes


Minutes from December 15, 2014 Standards & Practices Working Group Call

Compiled by Joey Heinen, Harvard Library


Announcements and Project Updates

  • Upcoming e-mail archiving project and tool demos, including Stanford University Library’s ePADD project, among others: 1/7/2015 2 pm EST, 1/21/2015 2 pm EST


Main Discussion – Progress Report on Stumbling Blocks Video Survey Results

The survey was conducted from early July to early August 2014. The idea emerged out of Standards and Practices group discussions in which many institutions voiced concerns over the challenges of preserving video content, noting that it is often the last content type to be easily ingested into most repositories. The survey aimed to identify the core challenges related to video preservation (both analog and digital).

The survey was broken into five sections and distributed via SurveyMonkey. The results varied in terms of response counts and respondent demographics, with representatives from government, state, and special libraries and some international reach (Canada, Germany, Australia, New Zealand, South Africa, etc.).

Winston Atkins discussed the analysis of institutional challenges relative to budgeting and staffing. There is still a question of metrics and how to analyze the data in terms of respondent types and their institutional context. They are currently considering models for creating institution groupings, one such model being the Carnegie Classification of Institutions of Higher Education (now overseen by Indiana University).

Kate Murray added that the survey did not cover which formats, or how many of each, are held within each institution/collection, and that this may be a consideration for future surveys.

Andrea posed the question of whether the survey respondents might be matched against the NDSA members list to gauge their level of engagement with the Alliance and to inform future conversations within the Standards group. Acquiring more user narrative might be useful for future surveys, since the optional “Tell Us About Yourself” section received limited replies.

Andrea Goethals next talked about the identified challenges that institutions were facing in their video preservation efforts and how the results were weighted. The question considered 14 challenges and asked respondents to rank those choices from 1 to 14, with 1 being the highest priority. Higher priorities were given a weighted score so as to fully emphasize areas of greatest identified need. Any challenge that an institution had identified as “problem solved” was given a negative value to offset this need.

The results were placed in a spreadsheet, final weighted values were tabulated, and a column was added identifying the number of institutions that had solved each problem. “Developing Workflows” was identified as the highest priority, with “Securing Storage” and “Funding Resources” also ranking high. Andrea noted that these results were interesting as they reflected practical considerations.
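
As a rough illustration of this scoring approach, a minimal sketch follows. The minutes do not specify the exact weights or the offset value, so the linear 15 − rank weight and the fixed negative offset below are illustrative assumptions, not the group’s actual formula:

```python
# Minimal sketch of the rank-weighting described above. Assumptions
# (not stated in the minutes): a linear weight of 15 - rank, and a
# fixed negative offset for challenges marked "problem solved".

NUM_CHALLENGES = 14
SOLVED_OFFSET = -5  # hypothetical value; the actual offset was not reported

def weighted_score(rank, solved):
    """Convert a 1-14 priority rank into a weighted score.

    Rank 1 (highest priority) gets the largest weight; a challenge the
    institution reports as solved gets a negative value to offset need.
    """
    if solved:
        return SOLVED_OFFSET
    return (NUM_CHALLENGES + 1) - rank  # rank 1 -> 14, rank 14 -> 1

# Tabulate totals across respondents, mirroring the spreadsheet step.
responses = [
    {"challenge": "Developing workflows", "rank": 1, "solved": False},
    {"challenge": "Securing storage", "rank": 2, "solved": False},
    {"challenge": "Management buy-in", "rank": 9, "solved": True},
]

totals = {}
solved_counts = {}
for r in responses:
    c = r["challenge"]
    totals[c] = totals.get(c, 0) + weighted_score(r["rank"], r["solved"])
    if r["solved"]:
        solved_counts[c] = solved_counts.get(c, 0) + 1

print(totals)         # aggregate weighted need per challenge
print(solved_counts)  # institutions reporting each problem solved
```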

For “Problems Solved,” respondents most often identified “Management buy-in” and “Knowing where to start” as solved issues. “Workflow challenges” were not frequently solved, nor was “Support/technical guidance”; however, this last criterion was also not ranked high among the priorities. For the most part, results were fairly consistent, with the highest-need challenges also ranking low on the “solved” list.

John Spencer asked whether this survey gathered “institutional” results or whether each institution had only one representative/respondent. Multiple submissions were accepted from within the same institution or IP address, but “true” duplicates, where the same person had submitted results more than once, were removed (a couple dozen instances).

Kate Murray next discussed issues specific to born-digital video. Interestingly, 73% of respondents identified analog video as a challenge, whereas 42% identified born-digital video and 34% other media. This was somewhat contrary to her expectations, given the complexity and variety of born-digital formats in existence.

51 respondents were involved in the Born-Digital Video (BDV) portion of the survey, and weighting similar to that in Andrea’s analysis was applied. Results were also filtered so that specific similar or differing criteria could be compared against one another. “Workflow challenges” were again consistently identified as challenges and infrequently marked as “solved,” with “Knowing where to start” again most often identified as solved.

For institutions that solved high-priority issues:

  • Workflow: Avalon, end-to-end (1 answer)
  • Repository services: cloud-based storage
  • Digital storage: cloud-based storage
  • Confidence in file formats (preserve all “as is”; stay natively with DV for reformatting (BT.601); considering normalization; already using ProRes 422 files from some producers)
  • When compared against all respondents, the main areas of difference were funding/resources (a higher priority for born-digital respondents) and repository services (a higher priority among all respondents than among born-digital respondents)

Hannah Frost was not able to call in to the meeting, so Kate spoke on her behalf. For institutions that were NOT challenged by a lack of storage and repository services, these related criteria were filtered to those ranked in positions 10-14. A heat map was then applied to the top 1-5 rankings for these groups, with orange-red indicating higher needs and green indicating low needs. “Technical guidance on video formats” and “Appraisal and selection” were higher priorities. Other areas that ranked on the higher side for some respondents within this grouping were “Knowing where to start,” “Funding,” “Format guidance,” “Video expertise,” and “Metadata/workflows.”
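
A small sketch of how this filtering step might look in practice. The column names, the pandas-based approach, and the sample data are illustrative assumptions; the group’s actual spreadsheet workflow is not described in the minutes:

```python
import pandas as pd

# Hypothetical respondent-by-challenge rank table (1 = highest priority).
df = pd.DataFrame({
    "respondent": ["A", "B", "C"],
    "digital_storage": [12, 11, 2],
    "repository_services": [14, 10, 3],
    "knowing_where_to_start": [1, 4, 5],
    "appraisal_and_selection": [2, 1, 9],
})

# Keep respondents NOT challenged by storage or repository services,
# i.e. those who ranked both criteria in positions 10-14.
not_challenged = df[(df["digital_storage"] >= 10) &
                    (df["repository_services"] >= 10)]

# Flag each remaining respondent's top 1-5 priorities; a conditional
# heat map over this boolean frame would show high needs in orange-red
# and low needs in green.
top5 = not_challenged.set_index("respondent") <= 5
print(top5)
```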

Lisa Snider presented the filtered results for questions 3-8 (expertise), 9 (technical/formats guidance), and 10 (metadata/standards guidance) and how these results related for respondents with similar answers. She checked whether there was any correlation between these answers in terms of solving the problem, looking at both raw and weighted data. “Technical guidance” answers agreed between groups 27% of the time, but “metadata” and “formats” showed more limited correlation (15% of the time).
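
One simple way to estimate such agreement between two questions is sketched below. The field names, sample data, and agreement metric are all assumptions; the minutes report the percentages without describing the exact method used:

```python
# Hypothetical per-respondent "problem solved" flags for two questions.
technical_solved = [True, False, True, False, True, False, False]
metadata_solved = [True, False, False, False, False, True, False]

# Agreement rate: among respondents who solved at least one of the two
# problem areas, the fraction who solved both -- one plausible reading
# of answers "correlating X% of the time".
both = sum(t and m for t, m in zip(technical_solved, metadata_solved))
either = sum(t or m for t, m in zip(technical_solved, metadata_solved))
print(f"agreement: {both / either:.0%}")
```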

A final discussion involved how this data should be made available, especially given that anonymity was promised at the outset of the survey and would need to be maintained.

Additionally, there was a question as to how these survey results could be brought to a “milestone” point and summarized so as to gain broader buy-in from deciding/funding bodies and external groups. Andrea Goethals thought that the data would need to be more fully processed and condensed into a report before considering an outreach approach for the study.


Future Topics for NDSA Standards and Practices Meetings

  • Follow-up to video, perhaps discussing best practices in workflows and formats (possibly recruiting some survey respondents)
  • Guest presenters from NLNZ and NARA to talk about DPTR (Digital Preservation Technical Registry, a formats and software environments registry), potentially on Jan 26th
  • Email archiving demos
  • Feb 23rd was proposed as the February meeting date, but a topic is yet to be determined; input is encouraged.