Assessment:Analytics
The DLF Analytics working group has been active since 2014. Early on, the group decided that its goal would be to draft a set of high-level metrics to track using analytics, and it eventually scoped its efforts around the widely adopted Google Analytics service. The resulting white paper, "Best Practices for Google Analytics in Digital Libraries," is now available.
Abstract to: "Best Practices for Google Analytics in Digital Libraries
The purpose of this white paper is to provide digital libraries with guidelines that maximize the effectiveness and relevance of data collected through the Google Analytics service for assessment purposes. The document recommends tracking 14 specific metrics within Google Analytics and provides library-centric examples of how to use the resulting data in making decisions and setting institutional goals and priorities. The guidelines open with a literature review and also include theoretical and structural methods for approaching analytics data gathering, examples of platform-specific implementation considerations, Google Analytics set-up tips and terminology, and recommended resources for learning more about web analytics. The DLF Assessment Interest Group (AIG) Analytics working group, which produced this white paper, looks forward to receiving feedback and additional examples of using the recommended metrics for digital library assessment activities.
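
As a point of reference for the set-up tips mentioned above, the basic page-view tracking pattern for the analytics.js library (the Google Analytics library current at the time of writing) looks roughly like the sketch below. The tracking ID UA-XXXXXX-Y is a placeholder, and the sketch assumes the standard analytics.js loader snippet has already been included on the page; consult the white paper and Google's documentation for authoritative set-up instructions.

    // Minimal sketch of analytics.js page-view tracking (assumes the standard
    // Google Analytics loader snippet has already defined the global ga() queue).
    declare function ga(...args: unknown[]): void;

    ga('create', 'UA-XXXXXX-Y', 'auto'); // placeholder property ID
    ga('send', 'pageview');              // records one page view per page load
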
White Paper Authors
- Molly Bragg, Duke University Libraries
- Joyce Chapman, Duke University Libraries
- Jody DeRidder, University of Alabama Libraries
- Rita Johnston, University of North Carolina at Charlotte
- Ranti Junus, Michigan State University
- Martha Kyrillidou, Association of Research Libraries
- Eric Stedfeld, New York University
Next Steps
The DLF AIG Analytics working group will present at the DLF 2015 session "Collaborative Efforts to Develop Best Practices in Assessment: A Progress Report" on Monday, October 26, at 1:30 p.m. The session will be recorded and available [HERE] following the conference. There will also be an opportunity to learn more at the DLF Assessment lunch on Tuesday, October 27.
As of the publication of the Analytics white paper, the future of the DLF Analytics working group is uncertain. The group looks to the digital library community for feedback, ideas, and volunteers in order to continue. Those interested in continuing the effort can attend the DLF sessions listed above, post to the DLF AIG Google Group, or contact Molly Bragg (molly.bragg at duke.edu) or Joyce Chapman (joyce.chapman at duke.edu) directly.
Old Page Content
In December 2014, we began to draft a set of high-level types of data we wanted to capture with analytics, such as the following (an illustrative tracking sketch appears after the list):
- Referrals
- Search terms
- Number of users
- Number of accesses
- Number of downloads
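
Several of these data types, such as downloads, are not captured by Google Analytics page views alone and are typically reported as events. The sketch below shows one way this might look with the analytics.js event API; the trackDownload helper, the 'Downloads'/'download' category and action names, and the example file path are illustrative assumptions, not recommendations from the working group.

    // Minimal sketch: reporting a file download as a Google Analytics event
    // so that downloads can be counted alongside page views and referrals.
    declare function ga(...args: unknown[]): void;

    // Hypothetical helper; category, action, and label values are illustrative.
    function trackDownload(fileUrl: string): void {
      ga('send', 'event', 'Downloads', 'download', fileUrl);
    }

    // Example: call when a user clicks a download link (path is illustrative).
    trackDownload('/collections/item-123/original.pdf');
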
We also began to define a set of content types, as those may impact the capture of analytics. Examples include:
- Institutional repositories
- Licensed resources
- Digitized unique content
- Datasets
- Finding aids
- Websites
We also realized that different audiences may need different information. Here is our first draft of potential audiences:
- Administrators
- Content selectors
- Metadata providers
- System administrators
If you're interested in helping this subcommittee move forward, please join the [Digital Library Analytics Google Group] and speak up!