Monday, November 19, 2012

Conference Notes LAC 2012: Parallel Session III (Teaching and Learning II)

Parallel Session III: Teaching and Learning II
Date: October 30, 2012, 10:30am

I. "Project RAILS: Rubrics, Results, & Recommendations."

  • See the RAILS website at http://railsontrack.info.
  • The purpose of RAILS: to serve as a clearinghouse of information literacy rubrics, to investigate rubric reliability and validity, and to develop training materials (I will be looking through this site quite a bit in the coming days). 
  • Case study from Belmont University, TN. They integrated rubrics into the general education curriculum, teaching them in freshman- and junior-level courses. 
  • Case study from the University of Washington Bothell. Focus: making rubrics scale.
    • Identify artifacts that reveal actual information literacy learning. 
    • A good rubric is detailed with analytic, precise, explicit performance descriptions. 
  • Analytical rubrics are more effective than holistic ones. 
  • Norming is critical for establishing a shared understanding of the rubric and achieving greater inter-rater reliability. A good rater is one able to work with others. 
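
A note to self on that last point: inter-rater reliability can actually be quantified once two raters score the same set of artifacts. Here is a minimal sketch of my own (not from the presentation) computing Cohen's kappa in plain Python; the rubric scores are made up purely for illustration.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters scoring the same artifacts.

    1.0 means perfect agreement; 0 means agreement no better than
    chance. Norming sessions should push this value upward.
    """
    n = len(rater_a)
    # Observed agreement: fraction of artifacts scored identically.
    p_observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n

    # Expected chance agreement, from each rater's score distribution.
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    p_expected = sum(
        (counts_a[c] / n) * (counts_b[c] / n)
        for c in set(counts_a) | set(counts_b)
    )
    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical scores (1 = beginning ... 4 = exemplary) given by two
# raters to the same ten student artifacts, before and after norming.
rater_1 = [3, 2, 4, 1, 3, 2, 4, 3, 1, 2]
rater_2_before = [2, 2, 3, 2, 3, 1, 4, 2, 1, 3]
rater_2_after = [3, 2, 4, 1, 3, 2, 4, 2, 1, 2]

print(f"kappa before norming: {cohens_kappa(rater_1, rater_2_before):.2f}")
print(f"kappa after norming:  {cohens_kappa(rater_1, rater_2_after):.2f}")
```

If the raters truly converged on a shared reading of the rubric, the "after" kappa should be noticeably higher than the "before" one.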
II. "Impact of Library Instruction in Different Academic Disciplines: An Analysis of Student Transcripts and Course Syllabi."

  • This presentation reported on work done at the University of Wyoming. 
  • (This was probably the presentation I found most useful and intriguing of the set. I think there are some things here I can bring back to my library). 
  • The question: does information literacy instruction make a difference? 
  • See also the article by Melissa Bowles-Terry (the presenter) in EBLIP (Evidence Based Library and Information Practice) 7.1 (2012): 82-95. (Adding this article to my reading list). 
  • See also the book Academically Adrift. (I have to add this to my reading list. I have been hearing a lot of buzz about it. I've seen reviews and rants about it too, so I probably should read it myself to decide). In the book, check the findings relating to CLA (Collegiate Learning Assessment) tests for college majors. 
  • Significance: 
    • Data shows library instruction can make a difference in student success. 
    • Students in under-served majors are not receiving library instruction. 
    • Yet each under-served major has either an IL-related departmental learning outcome or an inquiry-based assignment, so the opportunity for instruction is there. 
  • You can collect GPAs and correlate them with majors and library instruction (a rough sketch of this follows after this list). 
    • Research assignments + library instruction = better student outcomes. 
  • The idea is to conduct an information literacy audit (this is the idea that intrigues me; I need to read more on this). 
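
To make the GPA idea concrete for myself: with a table of students joining major, an instruction-attendance flag, and GPA, the comparison could look like the pandas sketch below. The column names and numbers here are hypothetical placeholders of mine, not the Wyoming study's actual data or method.

```python
import pandas as pd

# Hypothetical student records: in practice, GPA and major would come
# from the registrar, joined against instruction-session attendance.
df = pd.DataFrame({
    "major": ["History", "History", "Biology", "Biology",
              "Nursing", "Nursing", "English", "English"],
    "had_instruction": [True, False, True, False,
                        True, False, True, False],
    "gpa": [3.4, 3.0, 3.6, 3.2, 3.5, 3.1, 3.3, 2.9],
})

# Mean GPA with vs. without library instruction, overall and by major.
print(df.groupby("had_instruction")["gpa"].mean())
print(df.groupby(["major", "had_instruction"])["gpa"].mean())

# Point-biserial-style check: correlation between the 0/1 instruction
# flag and GPA across all students.
print(df["had_instruction"].astype(int).corr(df["gpa"]))
```

Of course, a correlation like this would not prove causation; majors with research assignments may differ in other ways, which is presumably where the information literacy audit comes in.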
 III. "Rolling it up: the Evolution of an Information Literacy Assessment Plan."

  • Report on work done at Keene State College.
  • Started by administering the SAILS test due to calls for assessment. 
    • Use it as a benchmark rather than as a way to compare your students from one year to the next.
  • Moved to a new "tiered" assessment model in 2011. This was then reinforced by a second tier focusing on IL instruction. 
  • Librarians were well-positioned for formative assessment. On a larger campus, feedback would likely be aggregated. (I think we can explore this at my library given our smaller population). 
  • Tier 2: Informing IL Curriculum and Assessment: 
    • Identify IL student learning outcomes (SLOs) for every session. 
    • Rubric assessment. 
    • Classroom assessment techniques. 
    • Data on use of CMS and LibGuides. 
    • Data on faculty support. 
    • Tracking information literacy outcomes at the reference desk. 
  • Some next steps for a task force of librarians: 
    • IL curriculum maps. 
    • Identify department outcomes. 
    • Identify instruction delivery models. 

 IV. "Assessment of Information Literacy as a Student Learning Outcome: Overcoming Barriers and Achieving Standards."

  • Report on work done at Simmons College. 
  • Analyzed 326 decennial accreditation self-studies. 
    • 228 (69.9%) include IL in the document. 
    • The majority place IL within undergraduate programs or general education. 
    • Majority of instruction occurs at the course or class level. 
    • Very little evidence of program integration. 
    • In terms of assessment, 116 institutions (35.6%) assess IL as a student learning outcome, most at the course or class level. 
      • 23.6% used surveys (student perceptions mostly). 
      • 21.5% used tests. 
      • 14.7% used class/course evaluations (again, student perceptions). 
      • Less than 1% assess IL through capstones or portfolios.
  • Obstacles in integrating IL: 
    • Resources, time, staff (a.k.a. the usual suspects). 
    • Faculty as the access point; buy-in is an issue. 
    • Lack of consensus:
      • On terminology. 
      • On roles. 
    • Student understandings and needs. 
    • IL as an "orphan of the curriculum" due to the disconnect between students, faculty, and librarians. 
    • Differing cultures. 
    • Lack of leadership for IL. 
  • Talk of accreditation can help bring administrators on board; however, it motivates faculty less. With faculty, talk instead about student learning, what they want students to achieve, and how to close the gaps. 
  • What librarians can do: 
    • Tailor communication to your audience. For instance, "selling" to faculty: students will do well/better, meet learning outcomes, etc. 
    • Implement an assessment plan. 
  • We have existing standards with objectives and performance indicators (the ACRL Information Literacy Competency Standards). Use them. 
