NDSA:Review options: Difference between revisions
Latest revision as of 14:20, 11 February 2016
The outcomes of this activity will be an annotated list of current self-assessment and audit options, and basic guidance for organizing and completing a peer review audit, with examples and other resources to demonstrate the components of a peer review audit.
This activity includes:
- a summary of available options (e.g., TRAC-based self-assessments, peer review audits, DRAMBORA reviews, Data Seal of Approval)
- documentation of examples to demonstrate how a peer review audit might work
- draft guidance for organizing a peer review audit using examples
- links to the community milestones and implementation examples, the other two activities in this project
Related project(s):
Drupal-based TRAC Review site

Update on January 27, 2014: Artefactual has posted the current version of the Drupal-based TRAC Review site on their website [1]. These additions have been made:
- An auditor role has been added, with a separate login and comment field, so organizations can incorporate an audit loop into their repository management workflow: once they have completed a self-assessment to produce a baseline of their evidence of conformance, an external peer or other auditor can review it
- A natural language question field has been added so organizations can tailor questions to their environment, making the process easier for contributors
- Courtney Mumma will be sending her examples of natural language questions for the TRAC requirements that Archivematica addresses; she developed the questions for the two audits she is completing
Main Project Page: [2]