NDSA:May 16, 2012 Standards Working Group Notes


NDSA Standards and Practices Working Group Notes of 5/16/12 meeting 1:00 to 2:00 Eastern


WebEx recording is here: https://issevents.webex.com/issevents/lsr.php?AT=pb&SP=MC&rID=52776352&rKey=81818970e5b4ba12


Topic 1: Digital Preservation Staffing Survey

Meg Phillips said that a subgroup had met since the last Standards webex to finalize the questions in the Google docs version of the survey, but that Andrea had been working on the survey since then.

Andrea Goethals gave the working group a tour of the version of the survey she had created in Qualtrics survey software. She explained that all the questions were optional except the first one, which asks what type of organization the response is for.

Suggestions for improving the survey:

Q5 – suggest adding a column for “not included in our preservation function” or simply “not applicable,” which has the virtue of being shorter. This will allow us to see the difference between rows skipped for some other reason and rows that simply don’t apply.

Q4 – suggest spelling out FTE and perhaps providing an example

Q15 – suggest adding to the text of the “other” option something like “e.g. vault department,” which may be the only phrasing some respondents would recognize.

There was also a general feeling that the option “one or more archives that steward the collection” was confusing. By the end of the discussion, the question had been edited to “a library, archives, or other department that stewards the collection” [may not be quite right]

There was a comment that an organization like the MetaArchive Cooperative would find questions like 8 and 9 hard to answer, but the “other” options would provide enough of an opportunity for them to explain their real situation.

Q22 – we still need an e-mail address to use for position descriptions, org charts, etc. The sense of the group was that a .gov (or maybe .org) e-mail address would be important to lend credibility to the survey.

Jimi offered to explore the option of getting a Library of Congress e-mail address to use for this purpose. If Jimi gets stuck, Meg can investigate a NARA.gov address. Later we got a comment that someone (Meg didn’t catch the name and Jimi didn’t remember – please forgive us!) thought it would be easy to get an @nedcc.org address, so that’s also an option.

Andrea asked if the group thought we should request the name of the respondent’s organization. We want to get only one response per organization, but without asking this explicitly there will be no way to check. No one had any objection to adding this question – the consensus was that people don’t mind providing this information. The best place to add the question may be at the very beginning, before the current first question about the type of organization.

Kate Murray asked what we planned to do with the data collected. She pointed out that we should explain this in the introduction to the survey itself and/or the e-mail we will draft to invite people to take the survey.

The group agreed that the final report should be distributed widely, but that the individually identifiable responses would only be shared with members of the Standards Working Group.

The group discussed whether the fact that an organization responded to the survey should be widely shared. There were several opinions about whether this was desirable, but the suggestion was made to add a question asking “Are you willing to have your responses identified by institution?” [this is what I wrote down during the meeting, but perhaps the sense was closer to “are you willing to be identified as a respondent in this survey if we do not identify any particular responses with your organization”?]

Action item: we need someone to draft the e-mail that will invite people to respond to the survey and an introductory paragraph about the purpose of the survey. Meg Phillips, Andrea Goethals, Jimi Jones, and Matt Schultz all volunteered to work on this.

Topic 2: Report on the National Association of Broadcasters Meeting – Linda Tadic*

Linda had provided written notes on the NAB meeting and she elaborated on her notes and answered questions about the meeting. The NAB meeting is one of two major trade shows for digital audio vendors and users. There were around 30,000 attendees.

Linda is also attending the “Screening the Future” conference, and she offered to bring a report on that conference back to the working group as well.


  • Following are Linda’s notes:

NAB 2012 notes

Physical media

1. The only physical videotape seen on display was HDCAM, in the Sony booth.

2. Sony’s optical disc archive. Based on Blu-ray discs. The archive cartridge holds 12 × 128 GB discs, about 1.5 TB total (1.536 TB, similar to LTO5’s compressed capacity). Does spanning. Stores any file format, not XDCAM-specific. Life expectancy (LE): 50 years.

3. Sony XDCAM. 2 new versions: (1) quad layer, “archival,” 128 GB, write once; (2) triple layer, 100 GB, rewritable. Files are still wrapped in Sony’s proprietary MXF wrapper.

4. LTO5 LTFS. LTO drives are manufactured only by HP and IBM, so all other drive vendors use mechanisms from one of these two.

What’s the difference between tar and LTFS?

Tar: faster. Unix-based, so it has a command-line utility. Interoperability is an issue: if a tape written on one deck is given to another deck to read, information about the originating deck must be known and supplied on the command line.

LTFS: Facilitates interoperability/interchange of tapes.
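The cartridge capacity quoted for the Sony optical disc archive in item 2 above works out as a line of arithmetic (a sketch; only the disc count and per-disc size come from these notes, and the decimal GB/TB convention is assumed because that is how storage vendors quote capacity):

```python
# Sony optical disc archive cartridge: 12 discs at 128 GB each.
discs = 12
gb_per_disc = 128           # GB per quad-layer disc
total_gb = discs * gb_per_disc
total_tb = total_gb / 1000  # decimal TB, as vendors quote it

print(total_gb, "GB =", total_tb, "TB")  # → 1536 GB = 1.536 TB
```

This matches the roughly 1.5 TB figure compared against LTO5 in the notes.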

MXF/JPEG2000 interoperability challenges

Ref. George Blood’s study on JPEG2000 files created on SAMMA and Digital Rapids: couldn’t play on each other’s players.

Front Porch Digital: Josef Marc said the problem was that the SMPTE MXF and JPEG2000 committees were not talking to each other. The issue is in the MXF specification. The two committees are working to resolve it, and once that is finalized, vendors can “fix” it in their software.

Digital Rapids: said the problem was SAMMA: it wasn’t proper MXF/JPEG2000. Not their problem.

“Archive” solutions

Increase in systems presenting themselves as “archival,” similar to a few years ago when most systems that indexed data in any way claimed to provide digital asset management.


Automatic QC

Digimetrics’ Aurora. Create a profile for how a specific format/codec should be defined, and the tool validates files against it. Digital Rapids’ Transcode Manager will be integrating with Aurora; ready in September.
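Profile-based QC of this kind amounts to probing each file’s properties and comparing them against an expected profile. A minimal sketch of the idea (the property names and values below are illustrative assumptions, not Aurora’s actual schema):

```python
# Expected profile for a hypothetical preservation master (illustrative values).
PROFILE = {"wrapper": "MXF", "codec": "JPEG2000", "width": 720, "height": 486}

def validate(probed, profile=PROFILE):
    """Return a list of (property, expected, actual) mismatches.

    `probed` is a dict of properties extracted from the file by some
    media-inspection step (not shown here)."""
    return [(key, want, probed.get(key))
            for key, want in profile.items()
            if probed.get(key) != want]
```

A file whose probed properties all match the profile yields an empty list; any mismatch is reported with the expected and actual values so an operator can triage it.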

USC Digital Repository repository.usc.edu

Sam Gustman: started as an off-shoot of the Shoah project. 40 PB online storage. Redundancy across nodes. Files are checked every 3 months.
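A periodic file check like the 3-month cycle mentioned above is typically a fixity audit: recompute each file’s checksum and compare it with a stored value. A minimal sketch (the use of SHA-256 and a dict-based manifest are assumptions for illustration, not details from the talk):

```python
import hashlib

def sha256_of(path):
    """Stream a file through SHA-256 and return its hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def audit(manifest):
    """Compare current digests with stored ones; return paths that changed."""
    return [path for path, stored in manifest.items()
            if sha256_of(path) != stored]
```

Running `audit` on a manifest of path-to-digest pairs returns an empty list when everything verifies, and flags any file whose current digest no longer matches the stored one.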

Using the Shoah system to catalog. Plans to license the cataloging system as a standalone product.

Clients have web access to files. Cloud storage: cost per GB, 1 TB minimum.

Cataloging: quoted per project
Digitization: separate cost