Wednesday, April 24, 2013

ACRL 2013 Conference Notes: Preconference on "Planning, Assessing, and Communicating Library Impact."

Date: April 9, 2013, 8:30am-3:30pm
Topic: "Planning, Assessing, and Communicating Library Impact: Putting Standards for Libraries in Higher Education into Action."
Presenters: Lisa Janicke Hinchliffe and Debra Gilchrist.

The preconference topic is based on the Standards for Libraries in Higher Education document, available at this link: http://www.ala.org/acrl/standards/standardslibraries. Note there are additional links and materials at the site, so do take a look.

A question to begin: how do we contribute value?

Some reasons people gave for being at this session:
  • Having the standards and tying them to university goals. 
  • They "made me" come (this was said in humor). 
  • To learn from others. 
  • Libraries are being judged. We need to know how to reply. 
  • Accreditation. We need to look and be different to the evaluators. 
The culture we desire here (and one I think we should desire in most places): collaborative, informal, conversational.

Context is important to understanding what is valuable. . .
  • . . . to us. 
  • . . . to faculty. 
  • . . . to users. 
  • . . . to administrators. 
Imagine the most effective library ever. How would you know? And then, how do you make it even better? This is what standards are for.

Big issues at institutions:
  • What are the campus administrators talking about? 
  • What is in the media? 
  • What key words do you hear?
Some answers to the big issues given at the session by participants:
  • Budget
  • Retention
  • Student learning
  • Student success
  • Attainment
  • Impact of e-books/e-learning
  • Job attainment 
  • Role of libraries and liberal education (this topic is certainly of interest to me now that I work at a liberal arts college)
  • ROI (that is "return on investment" for non-librarian and non-business folks)
  • High impact educational practices
  • Reputation
  • Future of higher education and "threatening" MOOCs and commercialization
  • Retaining relationships with graduates
  • Impact on research output
  • Future of the faculty
  • Intellectual property
We need to make our case. Ask if we are investing the college's money wisely.

Our institutions are diverse. Choose the standards that can guide your work and be most useful to your school.

There is value in collective thinking (as a quick illustration, see list of answers participants gave above). Start the work by engaging other colleagues in your library, then the faculty.

We looked at two evidence models: the evidence-based model and the outcomes assessment-based model. In this context, evidence is the stuff we collect to help us make judgments.

  • Evidence-based model = Principles (what does the library do?) >> Performance Indicators >> Evidence
  • In the Evidence-based model, we just document the activity. Impact is not as needed (though it is desirable to have and know). Note that you may need to develop models and services at your library before you can measure an impact. 
  • Outcomes assessment-based model = Principles (what does the library influence?) >> Performance Indicators >> Outcomes >> Assessment >> Evidence
# # # # # 
An example of the Evidence-based model (which you could put in a nice table)

  • Principle (from the standards): Collections (this would be #5 from the standards)
  • Performance Indicators: The library provides access to collections aligned to research, curricular focus, or institutional strengths. 
  • Evidence: a liaison program; collecting class syllabi; (specific) library policies.
  • (You could then add another column to your table for tracking. This would be so you know where the stats and data are kept)
# # # # # 

The key question for the Outcomes Assessment-based model: How are faculty, students, researchers, etc., changed as a result of our efforts? That is the definition of an outcome. Here we look at impact. What do our users do, or do better, as a result of our services and work?
  • A simple formula for an outcomes statement: "Because the library [action], [user group name] [verb phrase]."

# # # # # 
An example of the Outcomes-based model

  • The outcome: Campus community implements information literacy as a collective endeavor. 
  • Criteria: How will we know we are successful? What will be happening? 
    • X% of departments include IL learning outcomes in key courses, such as major gateway courses, capstone courses, composition courses, etc. 
    • X% of faculty embed information literacy into course assignments. 
  • Actions: What will we do to make this happen?
    • Liaisons collaborate with department faculty to strategically identify courses and design and embed outcomes. 
    • Liaisons discuss relevance of discipline-specific IL outcomes with departments. 
    • Develop discipline-specific assignment example website. 
    • Liaisons collaborate with faculty in assignment design. 
  • Evidence: How will you collect information? 
    • Curriculum map. 
    • Instruction log/Activity spreadsheet and Librarian/Faculty Collaboration Log.
This is an example. In this example, you simply find how many faculty are doing something. There is no need (or desire) to evaluate what is being done (and reassuring faculty that they themselves are not under evaluation is important). The curriculum map tells you, by department and program, where the outcomes are embedded. The actions are the concrete steps you take to make the outcome happen. 

# # # # # 

On constructing an outcome. This is what faculty/students/researchers/librarians will do as a result of engaging with the academic library.
  • A formula: Who? + Verb + Impact of Experience. 
  • Example: Campus Community + Implements + Information Literacy as a Collaborative and Collective Endeavor. 
This is a formula that makes sense. The rubric to measure it is, in essence, more criteria. Your verb choice "sets the stage." You don't always have to count; you can also judge through your professional lens. That can include, for example, asking students questions at the reference desk as you teach them to narrow their search terms.

Another outcome example:
  • Using Performance Indicator 4.1 from the standards document: The library organizes information for effective discovery and access. 
  • The (possible) outcome: Students retrieve information from the library catalog and electronic resources that enhances their ability to engage course material and assignments. 
  • Note: Go past the evidence (you can say that your library provides the tools to achieve the outcome) to see what the students can do. The verb "enhances" needs to be defined more precisely and made more active. Also, "retrieve relevant information" would be better than just "retrieve."
Summarizing on outcomes then:
  • The outcomes should be inspired by the Standards but determined by Institutional Priorities. 
    • Mission/Vision
    • Strategic Plan
    • Accreditation Guidelines
    • Campus Initiatives: general education (which we certainly have here on my campus), globalization (got some of that too), etc. 
  • Some approaches to develop the outcomes you could use: 
    • Brainstorm
    • Map to other documents
    • Benchmarking with other libraries
  • Think what stakeholders you want/need involved and at what stage: librarians, faculty, students, administration?

Refer to Planning/Assessment Cycle document (6 question design) (handout from session)
  • Begin by looking at the context you work in: your library, your campus, mission and values. Our library may have different outcomes from other libraries; again, diversity. 
  • Look beyond the LIS literature (something I try to do). Look at literature about students and higher education in general. Then consider how it applies to the library and how the library makes those higher education elements work. 
  • Your administration: listen to their messages and how they differ from one audience (what they say for the campus, for example) to another audience (say, to the trustees or the outside community). 
  • Some outcomes you may evaluate every year. Others could be every other year or two. 
  • When possible, look at your own internal documents. 
  • How is the library story told? How and where can it be embedded? (This is a question I often think about)
  • For outcomes, sometimes stakeholders may choose actions (action verbs) you might not choose. Let them choose, and guide them gently as needed; this fosters ownership. 
  • Where the institution is holistically may not be where the students are. You need to make this distinction. So, you could/would have different outcomes for one group or another. 
Assessment is not research. Assessment informs practice, but it does not necessarily prove impact. You could do research, but that is a separate thing. However, research can then come into play to tell the story.

Outcomes can be complex. Outcomes are things that matter.
  • Remember not to incorporate too many things in one outcome. Have multiple outcomes to tell your story. 
  • Know when to stop. And make it manageable. 
How do you measure the outcomes? Here is where criteria come in.

# # # # # 
Criteria example

Outcome: Researchers easily access relevant materials.
Criteria: # of articles downloaded / documents delivered.
Criteria: # of faculty expressing satisfaction with their ability to access materials (as measured by a survey).

You can choose a percentage of faculty whose expectations were exceeded, met, or not met (for instance, some of this data can be found via LibQUAL+).

# # # # # 

Criteria can be used individually or combined to achieve specificity. 
  • In some cases, criteria could make you rethink your outcome. 
  • It is easier to say what your action will be than to say how you will measure it. 
  • Question: what is the tension between aspiration and reality? This can vary from library to library. Criteria can be generated by us or externally imposed (and this does make a difference). 
  • Watch what is happening and being assessed across disciplines. Think strategically and design intentionally. 
 
Be inspired by the standards, not constrained by them. Look for library outcomes, then see how unit/department outcomes feed into the library. Yes, you can (and likely will) have unit/department outcomes.

On actions: you may also take them at other times, outside the cycle. An action is what is done, or happens, to make the outcome happen. Some examples:
  • Use of social media to market and promote library space/the library as a destination. 
  • Use of signage (print and digital). 
  • Student survey to determine marketing needs/targets. 
  • Promotion of services to faculty, say via a liaison program. 
  • You could think of "conversion rates" (the retail concept: count how many people come into the store, then how many "convert," that is, of those coming in, how many become buyers. I will admit I may be a bit skeptical of this; then again, I am always a bit skeptical when people want to inject retail ideology into our public services). 
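As a quick sketch of how that retail "conversion rate" idea could translate to a library setting (all names and numbers below are hypothetical, not from the session):

```python
# Hypothetical illustration of the retail "conversion rate" idea applied
# to a library: of everyone who enters the building, what share goes on
# to "convert" (checks out an item, asks a question, attends a workshop)?
# All figures are made up for the example.

def conversion_rate(visitors: int, conversions: int) -> float:
    """Return the share of visitors who took the desired action."""
    if visitors == 0:
        return 0.0
    return conversions / visitors

# Example: one day's gate count vs. circulation transactions.
gate_count = 1200   # people entering the library
checkouts = 180     # of those, how many checked something out

rate = conversion_rate(gate_count, checkouts)
print(f"Conversion rate: {rate:.1%}")  # 15.0%
```

Whether a checkout is the right "conversion" for a library is, of course, exactly the kind of retail-to-library translation one can be skeptical about.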
We need to convey/market that we are not just "support." We are partners and collaborators with the faculty and campus. Check the standards for performance principle and for personnel topics.

It can be helpful to look at other campuses but be careful. Their outcomes likely focus on specific local goals and outcomes. So don't just adopt someone else's stuff wholesale.

Now to evidence: collecting the data. If you have done the previous steps, this is easier/a bit more evident. Decide on what you are collecting then on what is sufficient.

Another sample outcome: Faculty utilize physical and virtual environments that facilitate their class preparation and research.
  • Sample criteria: 70% of faculty indicate library search tools assist them. . . . 
  • Sample evidence: faculty survey. 
On evidence, you may end up collecting "sensitive" data. Be sensitive to issues relating to human subjects data. Be aware of any laws, regulations, etc., related to student data and privacy. 
  • Know your campus IRB (Institutional Review Board) or campus assessment unit. 
  • Results of research can serve as assessment data. However, assessment data is not usually research (so, is this sort of like the "all bourbon is whiskey, but not all whiskey is bourbon" thing? Yes, small humor break). 
  • Performance evaluation is separate from assessment. 
Analysis: What you do with the data and what we learn from the evidence and data.
  • Trust is an issue. We need to reassure others on trust. 
  • It's not just what the data says, but why? 
  • "Assessment is a conversation, not just a report." --Deb Gilchrist. We need to ask who needs to be in the conversation and at what time or stage. 
  • Analysis can generate more questions. 
  • You can have hypothetical discussions before the data analysis is done and reported. What if the data reveals X instead of Y? This kind of scenario exercise can be a conversation tool; it can help you consider what might be done. It can also help placate those who may challenge the data just for the sake of challenging it. 
Communication is crucial throughout the process. Where is the information available and accessible? This is part of closing the loop (also part of just knowing where the heck you put things so you can find them later). 


# # # # # 
Sample benchmark exercise 
 
(something you can try when teaching this concept)
 
Collect a list of what each person at a table had for lunch, then rank who had the best. 

Question: how do you judge? (answers the groups came up with):
  • Nutrition, taste, attractiveness of the food. 
  • Are the ingredients fresh? 
  • Price
  • Personal satisfaction
  • External reviews (to get validation)
  • Ambiance of the eating establishment
# # # # # 

Benchmarking: we look to compare ourselves to another group, usually a peer community. How are we doing in relation to others with similar strengths and weaknesses? You can also do internal benchmarking; this is done over time, say from one year to the next.
  • You could use a peer library that you choose. Note that your campus may have a specific peer group in mind, so be aware of this.
  • In benchmarking, we use metrics. For ratios, decide which ones you want to use. 
  • "Data are what data are."
  • Ratios are more important than descriptive data. 
  • Remember that data is used to make arguments. 
  • Data are not decisions. Data are evidence of decision-making. It is important to be conversant with data and for your staff to be conversant too. Report out and report in. 
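The point above about ratios versus descriptive data can be sketched with a toy internal-benchmarking example (all figures here are made up, not from the session):

```python
# Hypothetical sketch of internal benchmarking with ratios rather than
# raw counts: a raw figure like the materials budget means little without
# a denominator such as FTE enrollment. All numbers are invented.

def per_student(value: float, fte: int) -> float:
    """Normalize a raw figure by full-time-equivalent enrollment."""
    return value / fte

# Two years of (made-up) data for the same library.
data = {
    2012: {"materials_budget": 410_000, "fte": 5_100},
    2013: {"materials_budget": 415_000, "fte": 5_600},
}

for year, d in sorted(data.items()):
    ratio = per_student(d["materials_budget"], d["fte"])
    print(f"{year}: ${ratio:.2f} materials spending per FTE student")
```

In this invented case the raw budget rose year over year, but spending per student fell; the ratio tells a different story than the descriptive number, which is the point.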
On telling the story. Is the foundation of the house you want to build strong enough? Is the story about the library? Or about the students? You get different dimensions of the same story based on what you ask and the choices you make. 

Consider using the organization inventory document (handout provided during session) as you start conversations. Do you have a conversation with your director? with your staff? with others?

On culture: 
  • Think perhaps of another time of major change and reflect on successes and obstacles. 
  • Leading change has to be intentional. 
We are informing our practice to better serve our students.
  • Be mindful. 
  • Reassure others of trust. Promises made must be kept. 
  • As leader, you model the way. 
  • Ask what challenges the library faces and what strategies you can use to address the challenges. 
Finally, collect your own data as needed. If no one else is collecting it, you collect it. 
 
(Note: as soon as I can get a working scanner, will try to put up the two handouts)
 
 
 
 
