NDSA:Review options

This activity will identify and describe self-assessment and audit options.
The outcomes of this activity will be an annotated list of current self-assessment and audit options and basic guidance for organizing and completing a peer review audit, with examples and other resources to demonstrate the components of a peer review audit.

This activity includes:

  • a summary of available options (e.g., TRAC-based self-assessments, peer review audits, DRAMBORA reviews, Data Seal of Approval)
  • documentation of examples to demonstrate how a peer review audit might work
  • draft guidance for organizing a peer review audit using examples
  • links to the community milestones and implementation examples, the other two activities in this project

Related project(s):

Drupal-based TRAC Review site

Update on September 16, 2013: Artefactual has the current version of the Drupal-based TRAC Review site and should soon be posting a downloadable, standalone version of the files for the site on the Artefactual site. These additions have been made:

  • An auditor role with a separate login and comment field has been added so that organizations can incorporate an audit loop into their repository management workflow: once a self-assessment has produced a baseline of evidence of conformance, an external peer or other auditor can review it (a minimal sketch of this loop follows the list)
  • A natural language question field has been added so organizations can tailor questions to their environment to make the process easier for contributors
  • Courtney Mumma will be sending her examples of natural language questions for TRAC requirements that Archivematica addresses. She developed the questions for the two audits she is completing
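
The additions above amount to a small data model: each TRAC requirement carries a natural-language question tailored to the organization, the organization's evidence of conformance, and a separate comment slot for the auditor role. The sketch below is a minimal, hypothetical illustration of that self-assessment and review loop in Python; it is not the Drupal implementation, and every class, field, and question name in it is an assumption made for the example.

  # Hypothetical sketch only (not the actual Drupal module): one way to model the
  # self-assessment / auditor review loop described above. Class and field names
  # are illustrative assumptions, not names from the TRAC Review site.
  from dataclasses import dataclass
  from typing import Optional

  @dataclass
  class TracRequirementEntry:
      """One TRAC requirement as captured in a self-assessment."""
      requirement_id: str                    # e.g. "A1.1" from the TRAC checklist
      natural_language_question: str         # question tailored to the organization
      evidence: str = ""                     # staff evidence of conformance (the baseline)
      auditor_comment: Optional[str] = None  # filled in later by the auditor role

      def record_evidence(self, text: str) -> None:
          # Self-assessment step: repository staff establish the baseline of evidence.
          self.evidence = text

      def review(self, comment: str) -> None:
          # Audit loop: an external peer or other auditor reviews the baseline
          # and leaves a comment without altering the evidence itself.
          self.auditor_comment = comment

  # Example: the repository completes its self-assessment, then an auditor reviews it.
  entry = TracRequirementEntry(
      requirement_id="A1.1",
      natural_language_question=(
          "Does your organization have a written mission statement that commits "
          "it to the long-term preservation of digital content?"
      ),
  )
  entry.record_evidence("Mission statement adopted in 2012; see governance policy v2.")
  entry.review("Evidence is sufficient; recommend linking the policy document directly.")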

Main Project Page: [1]