NDSA:Review options

From DLF Wiki
This activity will identify and describe self-assessment and audit options.
The outcomes of this activity will be an annotated list of current self-assessment and audit options, along with basic guidance for organizing and completing a peer review audit, including examples and other resources to demonstrate the components of a peer review audit.


== This activity includes: ==
* a summary of available options (e.g., TRAC-based self-assessments, peer review audits, DRAMBORA reviews, Data Seal of Approval)
* documentation of examples to demonstrate how a peer review audit might work
* draft guidance for organizing a peer review audit using examples
* links to the community milestones and implementation examples, the other two activities in this project
 
== Related project(s): ==
 
Drupal-based TRAC Review site
Update on January 27, 2014: Artefactual has posted the current version of the Drupal-based TRAC Review site on their website [https://www.archivematica.org/wiki/Internal_audit_tool]. The following additions have been made:
* Added an auditor role with a separate login and comment field, so that once an organization has completed a self-assessment to produce a baseline of its evidence of conformance, it can incorporate an audit loop into its repository management workflow for an external peer or other auditor to review
* Added a natural-language question field so organizations can tailor questions to their environment, making the process easier for contributors
* Courtney Mumma will be sending her examples of natural-language questions for TRAC requirements that Archivematica addresses; she developed the questions for the two audits she is completing
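As a rough illustration of the workflow described above — not Artefactual's actual data model — each TRAC requirement in such a tool pairs the official requirement text with a natural-language question, accumulated self-assessment evidence, and a comment field reserved for the auditor role. The class and field names below (`TracRequirement`, `question`, `evidence`, `auditor_comment`) are hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class TracRequirement:
    """One TRAC requirement as tracked in a review tool (hypothetical schema)."""
    code: str                    # e.g. "A1.1"
    text: str                    # official requirement wording
    question: str = ""           # natural-language rephrasing for contributors
    evidence: list = field(default_factory=list)  # self-assessment evidence notes
    auditor_comment: str = ""    # filled in by the separate auditor role

    def ready_for_audit(self) -> bool:
        # The audit loop starts once a self-assessment baseline exists.
        return bool(self.evidence)

# Usage: the repository records evidence first, then an auditor reviews it.
req = TracRequirement(
    code="A1.1",
    text="Repository has a mission statement reflecting a commitment to preservation.",
    question="Where is our preservation mission statement published?",
)
req.evidence.append("Mission statement linked from the About page.")
if req.ready_for_audit():
    req.auditor_comment = "Evidence sufficient; statement is current."
```

The point of separating `evidence` from `auditor_comment` is that the self-assessment produces a baseline that an external peer or other auditor can then review without altering it.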
 
Main Project Page: [http://www.loc.gov/extranet/wiki/osi/ndiip/ndsa/index.php?title=Audit_and_Certification:_Understanding_Options_for_Addressing_Standards_and_Requirements]

Latest revision as of 14:20, 11 February 2016
