NDSA: Staffing survey planning page
Latest revision as of 15:18, 11 February 2016
Project Origin
Excerpt from Meg's email that started this project:
"Recently I've had several interesting conversations with people at other institutions about how the work of digital preservation is and should be organized and staffed within organizations. We've touched on issues like: what different roles are necessary to do digital preservation, what is the division of labor, what kinds of skills are needed, are developers within a digital preservation unit or outside it, would people be willing to share org charts and position descriptions, and on and on.
Information like this could be useful for many reasons, including simply finding out how many different ways of doing this there are, benchmarking, identifying effective practices, and maybe making a case to strengthen staffing at our own institutions.
This topic clearly isn't related to "standards", but it probably is related to "practices." I was wondering if other members of this group would have any interest in conducting a survey of institutions to find out how they organize and staff the digital preservation function, and how they would like to organize and staff it. We could do something relatively simple, on the model of the storage survey recently conducted by another NDSA working group."
NDSA:List of places/people we want to receive the survey
NDSA:Survey-related action teams and timeline
Use Cases
In order to get a handle on what kinds of questions we'd like to ask, we should consider what we want to do with this survey. As you come up with questions and insert them in the section below, think about use cases for this survey and put them in this section.
- An institution wants to compare their digital preservation staffing levels and functions with those of other similar organizations.
- An institution is planning to implement a digital preservation program and wants to know what is both typical and considered ideal for staffing.
- A person wants to advocate for additional digital preservation resources to upper management and needs data about typical and ideal staffing levels.
Notes
Section 3.2 of the latest version of the TRAC successor is about Organizational Structure and Staffing. Specifically it requires the following:
- 3.2.1 The repository shall have identified and established the duties that it needs to perform and shall have appointed staff with adequate skills and experience to fulfill these duties.
- 3.2.1.1 The repository shall have identified and established the duties that it needs to perform.
- 3.2.1.2 The repository shall have the appropriate number of staff to support all functions and services.
- 3.2.1.3 The repository shall have in place an active professional development program that provides staff with skills and expertise development opportunities.
Should the survey start with a definition of digital preservation? If so, which one?
- from http://www.digitalpreservation.gov/about/:
- "Digital preservation is the active management of digital content over time to ensure ongoing access"
- from http://en.wikipedia.org/wiki/Digital_preservation:
- "Digital preservation is the set of processes, activities and management of digital information over time to ensure its long term accessibility. The goal of digital preservation is to preserve materials resulting from digital reformatting, and particularly information that is born-digital with no analog counterpart."
Draft questions
Survey Introduction
This survey is intended for institutions that currently preserve content in-house or plan to in the near future. Only one response should be submitted per institution.
Questions about the Institution
- Which of the following most closely describes the type or function of your organization? [Pull-down list: archives, historical society, library, museum, research group, other]
- Which of the following most closely describes the sector of your organization? [Pull-down list: academic, corporate, local government agency, state government agency, federal government agency, other]
- Is preservation the principal mission of your organization? [radio button: yes, no]
- (We may not need this question if we only ask those preserving content, since we can infer their overall mission from the kind of institution they are)
- Approximately how many people work within your organization? [Free text box]
- Approximately how many terabytes of storage space do you require for all copies of the content that you manage? [Free text box]
Questions about Current Practices
- Are the staff who do digital preservation in your organization centralized within one unit or are they scattered throughout your organization? [radio button: centralized, scattered throughout, not applicable]
- If they are centralized, does that group include IT staff (e.g. software developers, system administrators)? [radio button: yes, no, not applicable]
- How many people in your organization do digital preservation work either full or part time? [free text box]
- What are the titles of the people who do digital preservation work in your organization? [free text box]
- What department(s) are they in? [free text box]
- What are the major functions you include in your organization's digital preservation work at present? Select all that apply. [checkboxes: selection, digitization and processing, metadata creation, creation of derivatives and access copies, deposit to storage, fixity checks, file format verification, auditing, migration, refreshing, emulation, creation and maintenance of software tools, research, preservation planning]
- Did you hire digital preservation specialists or "grow your own" (or both)? [radio button: hire, grow your own, both]
- How much work experience, in years, do most of your digital preservation staff have? [free text box]
- What skills do you think are necessary in the people who perform these functions? [free text box]
- What educational background do you think is necessary to perform these functions? [free text box]
- Where do you turn for professional development opportunities for your digital preservation staff? [free text box]
- What types of professional development opportunities have you used? [free text box]
Questions about the Ideal Practice
- Do you think that the number of people you have working in digital preservation is sufficient? [radio button: yes, no]
- In what areas are you missing expertise that you need for digital preservation? [free text box]
- Is your digitization setup right for you at the moment? [radio button: yes, no]
- How many people do you think you need for digital preservation? [free text box]
- What are the major functions you wish to include in your organization's digital preservation work in the future? [checkboxes: selection, digitization and processing, metadata creation, creation of derivatives and access copies, deposit to storage, fixity checks, file format verification, auditing, migration, refreshing, emulation, creation and maintenance of software tools, research, preservation planning]
- What do you think the ideal functions, structure, and staffing of a digital preservation function would be (setting aside anything you actually have or are likely to have)? [free text box]
More Information
- Do you have org charts, mission statements, or position descriptions that you'd be willing to share? [radio button: yes, no]
- If they are online, what are the URL(s)? [free text box]
- If they are not online, please email them to X.
Lessons Learned
Please add anything here that you think we should do differently when we repeat this survey in a few years. It could be different audiences to target, different ways to announce the survey, rewording of questions, additional questions, etc.
- Methodology
- Survey tool (How well did it support ability to formulate questions, preview surveys, perform analysis, export data for archiving, etc.?)
- I thought it was incredibly powerful - much better than other similar survey tools I've used.
- It has a lot of great features - I had never used anything this powerful before. I wish the graphs it produces out-of-the-box were more legible though (higher resolution, not cutting off text, etc.).
- Announcement/recruitment method (targeted audiences, timing, announcement tools (listservs, blogs), etc.)
- We might want to be more targeted in recruiting people to fill it out on behalf of their institutions, especially since we're saying we only want one response per institution. Get someone in NDSA to commit to answering for their whole organization.
- Survey questions (see Copy of survey)
- (if you took the survey) Were there questions that were confusing or onerous to answer?
- Q10-12 were difficult to answer because the selections from 10 carried through to 11-12. Something may not have been "in scope" [in practice] but it might be an important part of digital preservation. For example, just because it is not in scope (Q10) does not mean that we would not be interested in a vendor assisting (Q12) with certain services.
- We had trouble with this whole zone, too. We do most of these activities, but they aren't done in a unit called "digital preservation." The line between activities we do and think of as digpres and those we do but don't define as part of digpres is not clear, which made all these questions hard. Q15, about the FTEs with different titles doing digpres, was nearly impossible - partly because we have people doing those functions who don't think of themselves as doing digpres, and partly because we weren't sure how to map our real titles onto the categories given.
- Were there questions you wished we had asked but didn't?
- (if you participated in the analysis) Were there questions that were difficult to analyze for any reasons?
- The type-of-institution question generated some strange results - we were inclined to change some institutions' answers to make things more consistent, which probably means that the categories we offered weren't clear or mutually exclusive.
- The FTEs question needs to be reworked in general. As it was implemented, you can't distinguish between someone explicitly giving an FTE value of 0 and someone leaving the question blank.
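One way to keep the zero-vs-blank distinction recoverable in a future round is to preserve blank responses as missing values (rather than coercing them to 0) when the survey export is parsed. A minimal sketch, assuming a CSV export with a hypothetical `fte` column; the field names and sample rows are illustrative, not from the actual survey tool:

```python
import csv
import io

# Hypothetical export: A reports an explicit 0 FTE, B left the field blank.
raw = "institution,fte\nA,0\nB,\nC,2.5\n"

def parse_fte(value):
    """Return a float for an explicit answer, or None for a blank (no answer)."""
    value = value.strip()
    return float(value) if value else None

rows = [
    {"institution": r["institution"], "fte": parse_fte(r["fte"])}
    for r in csv.DictReader(io.StringIO(raw))
]

answered = [r for r in rows if r["fte"] is not None]  # includes the explicit 0
blank = [r for r in rows if r["fte"] is None]         # question left unanswered
```

With this approach, an explicit 0 from institution A stays distinct from institution B's non-response, so analysis can report "answered 0" and "did not answer" as separate categories.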
Publish and archive teams
(Please edit this section to sign up for one of these teams - or send an email to Andrea or the group.)
Team to create the final report: Meg Phillips, Andrea Goethals, Carol Kussmann, Mary Vardigan
Team to archive the survey data: Winston Atkins, Mary Vardigan