v32#1 Data Privacy and Information Literacy Assessment: A Case Study

Apr 1, 2020

by Russell Michalak  (Director of the Library & Learning Center, Goldey-Beacom College) 

and Monica D. T. Rysavy  (Director of Institutional Research and Training, Goldey-Beacom College) 

Introduction

Collecting personally identifiable information (PII) in the context of the services and programs academic libraries offer is a topic of continuing interest to the library and greater academic community.  Patron data, specifically the data points associated with the use of services, programs, systems, and library instruction and training efforts, have the potential to demonstrate academic libraries’ value, as these data points can help illustrate “student success in the form of outcomes attainment, retention to completion or graduation, and post-graduation career placement and earnings” (Oakleaf, Whyte, Lynema, & Brown, 2017).  

However, there has been a backlash among librarians against the use of PII in academic libraries due to privacy concerns (Jones and Salo, 2018).  In this article, we share how PII is collected in our current information literacy assessment (ILA) program, how these data are used, and with whom they are shared.  This article also describes the benefits of academic libraries establishing partnerships with other academic departments, such as our long-standing relationship with the Office of Institutional Research & Training (OIRT).

Background

What the Library Used to Do

Pre-2011

Nine years ago, I (Russell) joined Goldey-Beacom College, a small private college (www.gbc.edu) in the Northeast.  One of my first tasks was to transform the library’s research and instruction program.  Prior to my being hired, librarians invited English faculty to bring students (generally enrolled in 100-level courses) into the physical library for a tour.  The librarians walked the students around the library to point out the location of the library’s resources (print books, CD-ROM databases, microfilm), to show them the computers and printers, and to indicate where they could study collaboratively on group projects or work in quiet study areas.  There was no formal tracking (i.e., no attendance was taken), nor was there a formal assessment component to this program (i.e., no evidence of knowledge gains as a result of this “library visit experience”). However, librarians did typically reach out each subsequent year to faculty who expressed interest to see if they wanted to set up another library tour. 

Upgrading Our Information Literacy Assessment Program

2011-2015

During my first few months at the college, I interviewed key faculty members about their expectations for the future of the ILA program.  They shared that they wanted students to be trained to navigate the electronic resources, develop a topic, evaluate information, and cite articles.  Many faculty members who requested information literacy training wanted librarians to teach students not to plagiarize. 

I learned during these meetings that faculty were hesitant to give up any class time, so we agreed to limit IL training to a single 45-minute session.  Sessions of this type are often referred to in the library literature as “one-shot information literacy sessions” (Mery, Newby, & Peng, 2012). While prepping the content for these one-shot IL training sessions that would teach the topics faculty requested, we ensured the concepts taught aligned with the ACRL Information Literacy standards (2004). 

To assess students’ understanding of the concepts taught post-training, librarians distributed a nine-question printed survey with multiple-choice questions covering the topics presented during the one-shot IL training session.  No PII was requested in this post-training assessment, which surprised the students; many asked, “Don’t you want our names?” Even so, many students shared their names on the assessment anyway. 

2015-Present

After a few years of delivering one-shot IL sessions in faculty’s classes with increasingly minimal student engagement, we looked for a way to further iterate our IL efforts.  Around the same time, our current college president charged me with creating an internal IL instrument that would assess the IL skills of our student population. I worked with Monica and her OIRT staff to develop the college’s first formal IL assessment, which we called the ILA.  This assessment included a questionnaire with five self-efficacy questions (we named this questionnaire the “SPIL-Q,” or “Students’ Perceptions of their Information Literacy Skills”; Michalak & Rysavy, 2016) and a 60-question multiple-choice assessment that mapped directly to the ACRL information literacy standards (2004).  The college president heavily encouraged us to collect copious amounts of data throughout the IL redesign process and to publish our findings.

We first distributed the ILA online, using a Google Form created by OIRT, to 932 international graduate students enrolled in at least one course during the summer 2015 semester.  We chose this population because it was the largest in the college at the time and because it was traditionally underserved by the library, as graduate faculty did not request IL instructional sessions.  The assessment-question portion of the form was graded for correct/incorrect responses using the free plug-in tool Flubaroo (http://www.flubaroo.com).  This assessment served as a formal benchmark that we shared with faculty to encourage them to use our ILA program.  Across all ACRL standards, the college’s international graduate students (n=172) scored below the graduate school’s academic standard of 70%, averaging 53%.  We also learned that female international graduate students felt more confident about their information literacy skills than male international graduate students, and that they performed better (Michalak, Rysavy, & Wessel, 2017).  We shared summarized results from this first deployment with the college community (by UG/GR division, by gender, and by major) in various marketing communications and created a variety of data visualizations and dashboards in Tableau in an effort to promote librarians’ IL training sessions. 
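For readers who want a concrete picture of this kind of summary, below is a minimal sketch in Python/pandas of averaging graded scores by ACRL standard and by gender against the 70% benchmark. The column names and sample values are placeholders invented for illustration, not the actual ILA export; our own grading and dashboards relied on Flubaroo and Tableau as described above.

# Minimal sketch (not our production workflow): summarizing graded ILA results
# by ACRL standard and by gender. Column names and values are hypothetical.
import pandas as pd

# One row per student; percent-correct scores per ACRL standard plus demographics.
graded = pd.DataFrame({
    "student_id": ["S01", "S02", "S03", "S04"],
    "gender": ["F", "M", "F", "M"],
    "division": ["GR", "GR", "GR", "GR"],
    "std1_pct": [60, 45, 70, 50],
    "std2_pct": [55, 40, 65, 48],
})

score_cols = ["std1_pct", "std2_pct"]
BENCHMARK = 70  # the graduate school's 70% academic standard

# Mean score per standard across all students, compared with the benchmark.
by_standard = graded[score_cols].mean().to_frame("mean_pct")
by_standard["meets_benchmark"] = by_standard["mean_pct"] >= BENCHMARK

# Mean overall score by gender (the comparison reported in Michalak, Rysavy, & Wessel, 2017).
graded["overall_pct"] = graded[score_cols].mean(axis=1)
by_gender = graded.groupby("gender")["overall_pct"].mean()

print(by_standard)
print(by_gender)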

Our president also charged us with designing online training modules with pre- and post-quizzes to measure students’ IL learning outcomes.  Drawing on the results of our first IL assessment deployment, Monica’s instructional design background, and input from members of the IL team, we developed our ILA program with the following components: 

• SPIL-Q Assessment

• A pre-test with 60 multiple-choice questions (10 questions per module)

• Six distinct online training modules providing instruction mapped directly to the five ACRL (2004) standards for information literate students (standard five was split, yielding six modules in total), each followed by a post-training module quiz of 10 questions

• A post-test with the same 60 multiple-choice questions 

We designed the ILA program for four distinct audiences: undergraduate first-year composition courses, the senior capstone English course, incoming graduate students, and incoming doctoral students.  Undergraduate first-year composition students complete the ILA as a required part of the course curriculum. Senior-level students in the capstone English course are encouraged to complete the ILA, but it is optional.  Incoming graduate and doctoral students are invited to complete the ILA during their first semester at the college; however, it is also optional for these populations. We have made efforts to embed the ILA in a more required, substantial manner for the latter two populations, but those efforts are still in progress. 

What Types of Identifiable Data Do We Collect?

For the Pre-Test, Post-Module Quizzes, and Overall Post-Test

• Single sign-on through Qualtrics automatically collected each participant’s first name, last name, and college e-mail address. 

Students were asked the following identifiable questions:

• Student ID Number

• Type of Student (Boot Camp, UG, GR, DBA)

• Student Athlete Status (Yes/No)

• Gender

• Age (18-24, 25-34, 35-44, and 45 or older)

Students are asked to enter their name (first and last) and college email for each of the training modules. 

All of the ILA program components (pre-test, training modules, post-training quizzes, and post-test) collect a date and time stamp for the completion of each component, as well as a duration stamp, i.e., how long it took each student to complete each component.

Other identifiable data points were sourced from the Office of Institutional Research & Training, including: 

• Student’s major

• Student’s overall (to-date) GPA

• Student’s ethnic status

• Student’s citizenship status

• Student’s ENG course section code (if applicable*)

• Student’s final ENG course grade (if applicable**)

* If students took the ILA while enrolled in one of the first-year or capstone ENG courses, they were considered to have taken the ILA as part of the course requirements. 

**If students completed the ILA while enrolled in one of the first-year or capstone ENG courses, we included their end-of-term course grade as part of the data reviewed, because we compared completion of the ILA program and program scores to the end-of-term ENG course grade. 
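To make the data flow concrete, here is an illustrative sketch, in Python/pandas, of how the Qualtrics-collected fields and the OIRT-sourced fields might be joined on the student ID number into one analysis record. Every field name, e-mail address, and value below is a hypothetical placeholder, not our actual schema or tooling.

# Illustrative sketch of how the two sets of identifiable data points could be
# joined into one analysis record. All names and values are hypothetical.
import pandas as pd

# Fields collected through Qualtrics (SSO plus the demographic questions above).
ila_responses = pd.DataFrame({
    "student_id": ["1001", "1002"],
    "first_name": ["Ada", "Ben"],
    "last_name": ["Lee", "Ortiz"],
    "email": ["alee@example.edu", "bortiz@example.edu"],
    "student_type": ["UG", "GR"],
    "pretest_pct": [52, 68],
})

# Fields sourced from the Office of Institutional Research & Training.
oirt_records = pd.DataFrame({
    "student_id": ["1001", "1002"],
    "major": ["Marketing", "MBA"],
    "gpa": [3.1, 3.6],
    "eng_section": ["ENG176-01", None],  # only if enrolled in a first-year/capstone ENG course
})

# Student ID is the shared identifier, so a left join keeps every ILA participant.
combined = ila_responses.merge(oirt_records, on="student_id", how="left")
print(combined)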

Why Did We Choose to Collect PII Data From Our ILA Program?

Faculty needed to be informed about students’ participation in and completion of the ILA program in the first-year and capstone composition courses; therefore, we needed to collect PII to provide this information to them.  In order for a faculty member to enter grades for a specific student, we needed to know module scores at the individual-student level. The ILA data collected and analyzed for English courses were shared with faculty only for their own students (meaning faculty received data about students on their particular rosters, with the exception of the chair, who received data for all course sections).  Overall data, as well as breakdowns such as how our students were performing by division or by a categorical group such as gender or degree program, were shared with members of the college’s executive council and college-wide in aggregate form, so that the data could not be tied back to individual students. 
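These two sharing paths can be sketched as follows, again in Python/pandas with entirely hypothetical column names and values: instructors receive individual-level rows limited to their own section, while the college-wide view is aggregated so no row can be tied to a student. This is a sketch of the logic, not our reporting code.

# Hedged sketch of the two sharing paths: individual-level scores go only to the
# instructor of record; campus-wide reporting is aggregated. Data is made up.
import pandas as pd

scores = pd.DataFrame({
    "student_id": ["1001", "1002", "1003"],
    "eng_section": ["ENG176-01", "ENG176-01", "ENG176-02"],
    "division": ["UG", "UG", "UG"],
    "gender": ["F", "M", "F"],
    "module1_quiz_pct": [80, 60, 90],
})

def report_for_instructor(section_code: str) -> pd.DataFrame:
    """Individual-level rows, limited to the instructor's own section."""
    return scores[scores["eng_section"] == section_code]

def report_for_college() -> pd.DataFrame:
    """Aggregate-only view: group means and counts with no student identifiers."""
    return (
        scores.groupby(["division", "gender"])["module1_quiz_pct"]
        .agg(["mean", "count"])
        .reset_index()
    )

print(report_for_instructor("ENG176-01"))
print(report_for_college())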

To collect PII, we needed faculty buy-in, and PII in turn allowed us to convince faculty of the value of the ILA program offered by the library.  From the beginning of the ILA program, our strongest supporters were English faculty. After one semester of piloting the ILA with the ENG faculty chairperson’s course sections, he agreed to make the ILA a required course activity for ENG 176.  This meant, however, that we needed to collect identifying information from students, because faculty needed to enter scores for the specific ILA components in the gradebook. Without collecting data in a PII format, this could not be accomplished. 

Collecting PII for the ILA program also enabled us to determine how information literate our students were by other categories, such as academic division (undergraduate, graduate, and doctoral), gender, and degree major.  We could then address specific populations’ concerns more effectively. For example, because we knew female graduate students scored better on the ILA program than male graduate students, we could target male graduate students with remedial support through specific emails sent with our email campaign software, ConvertKit.  Likewise, if we knew that undergraduate marketing majors performed most poorly on the “cite” module, we could design supports to specifically address this. 
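As a hedged illustration of that kind of group-level question, the sketch below (hypothetical data and column names, not our actual analysis scripts) finds the lowest-scoring module for each major.

# Sketch of the kind of group-level question PII makes possible: which module
# did each major score lowest on? Module names and scores are made up.
import pandas as pd

module_scores = pd.DataFrame({
    "student_id": ["1001", "1002", "1003", "1004"],
    "major": ["Marketing", "Marketing", "Finance", "Finance"],
    "module": ["cite", "cite", "cite", "evaluate"],
    "quiz_pct": [48, 55, 72, 66],
})

# Mean quiz score for each (major, module) pair, then the weakest module per major.
by_major_module = (
    module_scores.groupby(["major", "module"])["quiz_pct"].mean().reset_index()
)
weakest = by_major_module.loc[by_major_module.groupby("major")["quiz_pct"].idxmin()]
print(weakest)  # e.g., Marketing majors' lowest mean is on the "cite" module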

Conclusion and Next Steps

As our next step for the ILA program, we will compare students’ OpenAthens logins (OpenAthens is an identity and access management tool that lets us learn who accessed which electronic resource, e.g., EBSCO, ProQuest, Adam Matthew, Sage) to the scores they received in the ILA program, to their graded course assignments, and to their end-of-course grades in ENG 176.  We are curious to see whether students who log into our databases (we do not know whether they download articles) will have higher end-of-course grades than students who do not log into our electronic resources.  
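Below is a rough sketch of what that comparison might look like in Python/pandas, under the assumption that we can export a per-student count of OpenAthens logins and match it to ILA scores and ENG 176 outcomes by student ID. All names and numbers are invented for illustration.

# Rough sketch of the planned comparison; column names and values are hypothetical.
import pandas as pd

logins = pd.DataFrame({
    "student_id": ["1001", "1002", "1003"],
    "openathens_logins": [12, 0, 5],
})
outcomes = pd.DataFrame({
    "student_id": ["1001", "1002", "1003"],
    "ila_post_test_pct": [78, 61, 70],
    "eng176_final_grade_pct": [88, 74, 81],
})

merged = logins.merge(outcomes, on="student_id", how="inner")
merged["logged_in"] = merged["openathens_logins"] > 0

# Compare mean ENG 176 grades for students who did vs. did not log in to databases.
print(merged.groupby("logged_in")["eng176_final_grade_pct"].mean())
# And check how login counts track with ILA post-test scores.
print(merged[["openathens_logins", "ila_post_test_pct"]].corr())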

Transparency with stakeholders like faculty is essential for libraries to receive funding from administrations, grant-funding agencies, and state legislatures; silos are a detriment to libraries, because if our stakeholders do not know how librarians add value to student success, then libraries won’t receive funding for programs.  Only when librarians collect PII for a service or program like our ILA program can librarians make meaningful connections that illustrate student success. Otherwise, libraries will continue to report only usage numbers, which do not show value to the parent organization. Libraries will only be valued when librarians work with faculty in a model that faculty are already familiar with.  Collecting PII and attributing scores to the course work students complete is such a model.

References

Association of College & Research Libraries.  (2004).  Information literacy competency standards for higher education.  Retrieved from www.ala.org/acrl/standards/informationliteracycompetency#ilassess.

Bandura, A.  (1977).  Self-efficacy: Toward a unifying theory of behavioral change.  Psychological Review, 84(2), 191-215.  https://doi.org/10.1037/0033-295X.84.2.191

Jones, K., and Salo, D.  (2018).  Learning analytics and the academic library: Professional ethics commitments at a crossroads.  College & Research Libraries, 79(3), 304-322.  https://doi.org/10.5860/crl.79.3.304

Mery, Y., Newby, J., and Peng, K.  (2012).  Why one-shot information literacy sessions are not the future of instruction: A case for online credit courses.  College & Research Libraries, 73(4), 366-377.  https://doi.org/10.5860/crl-271

Michalak, R., and Rysavy, M. D. T.  (2016).  Information literacy in 2015: International graduate business students’ perceptions of information literacy skills compared to test-assessed skills.  Journal of Business & Finance Librarianship, 21(2), 152-174.  https://doi.org/10.1080/08963568.2016.1145787

Michalak, R., Rysavy, M. D. T., and Wessel, A.  (2017).  Students’ perceptions of their information literacy skills: The confidence gap between male and female international graduate students.  Journal of Academic Librarianship, 43(2), 100-104.  https://doi.org/10.1016/j.acalib.2017.02.003

Michalak, R., Rysavy, M., Hunt, K., Smith, B., and Worden, J.  (2018).  Faculty perceptions of plagiarism: Insight for librarians’ information literacy programs.  College & Research Libraries, 79(6), 747-767.  https://doi.org/10.5860/crl.79.6.747

Oakleaf, M., Whyte, A., Lynema, E., and Brown, M.  (2017).  Academic libraries & institutional learning analytics: One path to integration.  Journal of Academic Librarianship, 43(5), 454.  Retrieved from https://search.proquest.com/docview/1961425401?accountid=38531.

Rysavy, M. D., Michalak, R., and Hunt, K.  (2018).  Information literacy education for first-year composition students: A case study of three deployment modes.  In I. Bouchrika, N. Harrati, and P. Vu (Eds.), Learner Experience and Usability in Online Education (pp. 85-121).  Hershey, PA: IGI Global.  https://doi.org/10.4018/978-1-5225-4206-3.ch004
