Friday, June 20, 2014

SAALCK Assessment Workshop 2014 Notes

This workshop took place on Friday, May 30, 2014 at Crabbe Library, on the Eastern Kentucky University campus. The featured speaker was Megan Oakleaf. The workshop also featured a panel of speakers from Murray State University. The workshop was provided by SAALCK. What follows are some of my notes from the event. As usual, material in parentheses is usually my comments and additions.

Topic: Opening Speech by Megan Oakleaf on the topic of "Academic Libraries and Institutional Impact Overview." (This presentation did have a PowerPoint. We did get a copy of it as a PDF--link to it on my Google Drive, let's see how it works out. A handout packet was included, much of it drawn from a workbook by Megan Oakleaf, which can be purchased here. We'll probably order a copy for the library here.)

  • We began with some opening questions. 
    • How do we link library outputs to student learning outcomes? 
    • How do we develop an assessment plan?
    • How do we translate what we do into numbers? 
  • We got a small index card at the beginning of the presentation. The card had an owl sticker on one side and a monster sticker on the other. The instructions were to jot down, as we went along, any good ideas we heard or learned on the owl side. On the monster side, we jotted down why the ideas could not be done--the challenges. 
    • (Idea #1 I jotted down: got validation of the value of reflective writing and other reflective exercises in assessment. This is something I have been starting to talk about in our library as something we need to be doing. Idea #2: the library proactively assigning "research advisors" to support certain student groups. I think this could work for us here.)
    • (Challenge #1 is the ever-present challenge of time. Challenge #2 is gaining advanced knowledge. One way I am acquiring it is piloting things, learning from them, and moving forward.)
  • What do we think when we hear "library value"? "Assessment"? 
    • The "elevator speech." Assessment is the tool, the measures. 
    • Value is the ROI (return on investment). Others would say it is how the library contributes to student success. One way to define success is helping with student retention and helping faculty be better teachers. 
  • See also The Value of Academic Libraries report (link to website. The report itself is available, but keep in mind it is a pretty big PDF file). 
  • "Satisfaction" measures are not really a big deal in assessment. We want to really measure outcomes in learning, so on. 
    • Yet we measure a lot of satisfaction and use. "Use" does not really measure value past a starting point. 
    • Difficult to measure what students do in academic terms. 
    • ROI studies do have limitations. It's one strategy, but it is not perfect. Also it is dependent on things like institution size. 
  • Ask how well the library contributes to overall goals of the parent constituencies. 
  • There is a trend in the library literature showing libraries moving from being passive to being active (this is not news to me really). 
    • Product>>Service. 
    • Collections>>Experience. 
    • Mediation>>Enabling.
    • Resources>>Educational Impact. 
    • Facility>>People. 
    • Access>>Sense-making. 
  • We build spaces, but we often fail to think how spaces will be used and what impact they will have on students.
  • Some recommendations/highlights from the Values in Academic Libraries report (report linked above):
    • Think institutionally. Be able to translate plans, etc. for other constituencies. What are campus leaders talking about in their speeches, so on? 
    • Ask: what do we enable people to do? And to answer that question, we need to know what people have done. 
    • As much as possible, use existing data. Your campus probably has a lot of data already; find out who keeps what data. Your library probably keeps a good amount of data as well. 
  • Question to ask: what does your institution value? (What follows are some ideas from a brainstorm in the room. This list is mostly what the institutions would say, or at least what the people in the room said or thought their institutions would say): 
    • grades
    • program completion
    • graduation
    • admission to graduate school
    • regional engagement
    • public service
    • economic development
    • athletes
    • serving international students
    • brand
    • (what I would say) good citizens and lifelong learning (no one said either one or something close to it, which honestly makes me wonder)
  • Students appear to acquire "information literacy" skills as a consequence of instruction, but assessments are scattered and episodic, not coherent and longitudinal. 
    • Much of the library instruction literature is about minutiae (I know; I read a lot of it). 
    • (I am thinking that for us here, use of the HEDS survey and keeping track of cohorts, we may be able to get some good data for assessment and improvement of our programs to better help student learning.)
    • Oakleaf is not much into pre- and post-tests. (To be honest, neither am I, but convincing some people in higher positions of this can be a bit of a challenge.) Where the students come from does not matter as much as what they leave with. (This is what really interests me. However, I would add that we do need to know at least some of where they came from in order to know where they are going. How much of where they came from we really need to know has been a point of contention here for some of us.)
  • Another question: how do you know scholarship is a conversation? This is where reflective writing and performance assessments, such as concept maps, come in. (This is something I have started to discuss here. We still have a ways to go)
    • Time and scaling are challenges. It's rigorous, and we may need more skills to carry it out. 
  • We need to be familiar with learner and learning analytics. Know where and when student behaviors/inactions/activities are tracked. 
    • Help to find problems in curriculum in order to fix them. 
  • (A reminder to myself to review this: http://gypsylibrarian.blogspot.com/2013/05/acrl-2013-conference-notes-contributed_17.html)
# # #

Topic: "Conceptualizing and capturing library value."

  • Idea: students using company profiles, say from a database like Business Source. Ask if the students are using them to prep for job interviews? And if they do, ask if they did better on interviews. (To be honest, there was only one place that even mentioned this idea, and it was a few years back. The question was mentioned, but never explored, so I definitely see a research idea here.)
  • We need more information on library impact and faculty teaching. We do know, anecdotally, faculty citing library support for them having more time to do research, to develop lesson plans, etc. (Again, another investigation opportunity perhaps?)
    • How else do we support faculty? Do you help in grant seeking and writing? In tenure and promotion? Keep track if you do. 
  • Idea: We need to learn about ILL (interlibrary loan) impact on students. 
  • Question: do students use library resources for prepping for and being involved in internships? 
  • Think of assessment as record keeping. Keep track of numbers as well as quotes and testimonials to enrich assessment narratives. 
  • I am thinking the library impact map can be a good exercise to do in our library. Break it down for different units.
    • To begin investigating and prioritizing what to investigate, find out what the institution cares for. 
    • I would divide the chart by department/units of the library and go from there. 
    • Another idea: use a similar grid, but with library policies to look at policies' impact on patrons. 
    • Be careful not to be overwhelmed (which is why I would prefer the grid to be divided into sections). 
    • On campus tours that stop at the library, make the stop more effective. Provide talking points (and try to measure impact with at least one survey question). 
    • Consider where are the "invisible" areas of the library that have an impact.
  • In assessment, we often want to answer "causal" questions, but causation can't really be established. You can't control every single factor; it is not a closed system.
    • Goal is to identify behaviors that lead to an outcome, positive or negative. Work then to improve the positive. 
    • What you want to show is a correlation: what contributes to the positive outcome, alongside other activities. 
    • Ask yourself: is it enough to describe the profile of successful students and seek to increase students that emulate the desired attributes?
    • To stave off criticism (say, from the usual "picky" faculty who feel the need to wave their "big stick" around), state your research up front. Say what something means and what something does not. 
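Since the point above is that these studies aim for correlation rather than causation, the basic measure behind them can be sketched in a few lines. This is my own minimal illustration, not anything presented at the workshop; the student numbers below are entirely made up.

```python
from statistics import mean

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length lists."""
    mx, my = mean(x), mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Hypothetical data: library checkouts and end-of-term GPA
# for five students (illustrative values only).
checkouts = [0, 2, 5, 8, 12]
gpa = [2.1, 2.8, 3.0, 3.4, 3.7]

r = pearson(checkouts, gpa)
print(round(r, 2))  # a value near 1 means the two track each other closely
```

A coefficient near 1 would suggest library use tracks the outcome, but, per the caution above, it would not show that the library caused it.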
  • Consider your library communications. 
    • Which institutional values are reflected/emphasized in library communications? One part of communication goals should deal with library and institution goals (retention, learning, etc.). 
    • What parts of our communications communicate impact on institutional focus areas? 
    • Keep in mind: different audiences may need different versions of (the same) information. 
  • Reminder note to review some of Lisa Hinchcliffe's work.  
# # #

Topic: Panel Presentation from Murray State University on "Assessment in Action."

  • Reporting on a campus and library project focusing on retention. 
  • If you decide to present (at a conference, workshop, a publication, so on), you may have to go through your IRB (institutional review board). 
  • They identified library users using campus data (students enrolled, faculty lists, so on), then used library data for identifying patron checkouts. (If you use something like Voyager for your library information system, you are looking at things like historical library checkouts.)
    • They added ILLiad (interlibrary loan system) users. They do note they ran into some issues doing this, in part because ILLiad does not always "play nice" with other library systems. 
    • They used EZproxy data for electronic resource use. Their patrons use the "usual" credentials (campus e-mail user name and password). They make every user log in, whether on campus or off campus. (Some attendees were skeptical of this, but the speakers claim that campus complaints were low when this was implemented. I have to say I found that pretty impressive. On other campuses where I have worked, making such a suggestion would mean more than just "a few complaints.")
    • Most of the data is "yes" or "no." The idea is to see if there is a sense of a user community building up. 
    • Their data can now capture use by distance students as well. (For us, being a residential campus, this would not be a big concern. Maybe to track some off-campus students, say those traveling abroad.)
    • Idea: providing documents with steps, how to talk to stakeholders, so on. 
    • Make sure that you can articulate the benefits in collecting data for assessment. 
    • Make sure you start early conversations with institutional researchers on the campus.
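As I understood the panel, the matching step boils down to starting from the campus roster and flagging each student yes/no per library data source. Here is a rough sketch of that idea under my own assumptions; all IDs and source names are invented for illustration, not taken from the Murray State project.

```python
# Hypothetical data sources: campus roster plus sets of student IDs
# seen in circulation, ILLiad, and EZproxy records.
roster = ["s001", "s002", "s003", "s004"]        # campus enrollment list
checkouts = {"s001", "s003"}                     # IDs with checkout records
ill_requests = {"s003"}                          # IDs with ILL requests
ezproxy_logins = {"s001", "s002", "s003"}        # IDs seen in proxy logs

def usage_flags(student_id):
    """Return yes/no flags for one student across the three sources."""
    return {
        "id": student_id,
        "checkout": student_id in checkouts,
        "ill": student_id in ill_requests,
        "e_resource": student_id in ezproxy_logins,
    }

# One row per enrolled student, matching the "yes or no" data the
# panel described; non-users (like s004 here) stay visible in the table.
table = [usage_flags(s) for s in roster]
for row in table:
    print(row)
```

Starting from the roster, rather than from library records, is what keeps the non-users in the picture, which matters if the question is whether a user community is building up.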
# # #

Topic: "Taking Library Value Home" (back with Megan Oakleaf). 

  • Reflection on ideas to take back. 
    • (Review parts of the library impact grid with our instruction team.)
    • Write something on our assessment efforts in a newsletter (for us, this could be done on our library blog). An article on student outcomes and learning. (This can also add transparency.)
    • A presentation to faculty or a select group (for us here, for example, it could be to the Committee on General Education) on the Value of Libraries document.
    • Work more on collecting anecdotal information and testimonials. Things like filling out a small card at the reference desk. Sending out a small 2-5 question survey to students after instruction, so on. 
    • Quote: "you need to find the right key to unlock people's minds." This is especially true for resistant folks. So think ahead of time how you will address any resistance and answer any objections.
    • Assessment of LibGuides. Connecting to instruction and pedagogy. 
    • Work to build a culture of assessment into your strategic plans. 
    • Think big in an organized way. However, you can start small, but do start. 
 


