CAT Meeting Minutes 9.24.15

CAT Meeting
September 24, 2015
1:00-2:00 p.m.
Virtual meeting using Zoom
Members present: Jen Fabbi (San Marcos, Chair), Sarah Dahlen (Monterey Bay), Sue Thompson (San Marcos), Stephanie Alexander (East Bay), Laurie Borchard (Northridge), Ann Agee (for Diana Wu, San Jose State), Michele Van Hoeck (California Maritime Academy, Vice-chair), and Laura Gil-Trejo (Fullerton, Consultant).
Members absent: Tiffini Travis (Long Beach) and Felicia Kalker (Sonoma State)
Recapping Meeting Minutes: The group reviewed the minutes from the previous meeting and approved them as an accurate record of the discussion.
Google Drive: Anyone who has not received access to the drive should contact Jen. A folder was shared that contains:

  • Approved charge
  • Members
  • Minutes from April and August meetings
  • Report that Laura prepared when CAT was originally formed, and some other documents

This folder will be our workspace for future initiatives and our committee history.


Toolbox Update: Toolbox is located at http://libguides.sjsu.edu/c.php?g=230377

  • The conclusion from our last meeting was to formulate a list of assessment contacts from all CSU campuses. Jen was able to collect this list from all of the library deans at the September COLD meeting. Laurie completed the list by adding email addresses (the list is located in the Google Drive).
  • Laurie sent a message to all of the contacts to get feedback on the toolbox via the embedded Google form. Members of the group confirmed they received the email.
  • Diana manages the Google form, and she may be the only one who is able to see whether anyone has provided feedback. Action item: Confirm this, and confirm whether any feedback has been received. We also need to have Diana give access to Sarah and Monica so they can both edit the toolbox and receive email notifications when feedback is provided.
  • The group agreed that we need to provide more specific instruction on the actual form. Monica will be responsible for changing the form to read:
    • For each section of the toolbox, please offer suggestions/feedback in the corresponding boxes below.
  • Do we launch as is with the promise that we will continue to work on it or do we wait? Action item: Jen will send another request for feedback to the group of assessment contacts with more specific instructions. She will also give people the option to complete the form or send the feedback directly to her. If she receives feedback, she will forward to Monica and cc: Diana and Sarah.
    • Specific prompts to consider: Do you have your own favorite assessment book or article to share? Do you have best practices that you follow (web link)? How would you like to see this toolbox develop?
    • Deadline: October 16th


The value added of this resource, as expressed in our April meeting, will be to "share collective practices of institutions around certain areas, such as signature IL assessments." These shared practices should be the future focus of development of the Toolbox.
Report on IL Assessment Pilot Project (Laura):

  • Laura shared that the pre-survey at both Northridge and Fresno is underway with good response rates.
  • Members of this subgroup are: Lynn Lampert, Laurie Borchard, and Jennie Quiñónez-Skinner (Northridge) and Amanda Dinscore (to be confirmed) and Dave Tyckoson (Fresno).
  • There was a lengthy discussion about next steps, specifically how to rate student work that may be submitted during the post-survey. This could be 150-200 papers. The committee decided that the subgroup will select a rubric to use. Action item: Laura will coordinate a meeting to discuss rubric selection.
  • After a rubric is selected, the subgroup will share with CAT and take next steps to recruit potential volunteers to rate student work. This may be opened up to our list of assessment contacts across CSU libraries.
  • Additional rubric discussion:
    • We need to focus on a rubric that is targeted to first year students and can be applied to different projects.
    • We also want a valid measure.
    • It is important to look at reliability and validity.
    • The norming process will likely be the most time consuming aspect of the project to date.
    • Action item: If anyone on the committee has more general rubrics used for other projects, please send them to Laura.


Discussion on new CAT initiatives: We ran out of time for discussion here, but Jen reported that she asked all COLD deans to rank a number of areas for future CAT focus (see aggregate responses attached to these minutes). All 23 campuses responded. The priority areas are:

  1. WASC IL assessment, encompassing sharing and developing best practices, intersections between the core competencies of IL and critical thinking, and signature IL assignments (IL assessment at campus level)
  2. Studies of how library use impacts student success (e.g., retention and GPA)
  3. Digital learning objects with built-in assessments, pre-/post- tests, shared rubrics (IL assessment at Library level)
Before our next meeting, Jen will collect ideas from committee members on how best to pursue a new area(s) of focus.
Adjourned. Next meeting is October 19th from 4-5 p.m. Jen will send out another Doodle poll to identify the best time and day of the week for a regular monthly meeting.

CAT Calibration: Step 1
9/4/15

My campus is:
Please rank the following potential CAT initiatives in order of interest to your campus. Those that are ranked as highly important and that match member expertise would be further developed into CAT work projects with the goal of creating infrastructure to benefit multiple campuses:
Please rank 1-8, one being highest ranking:
Counts of 1st-, 2nd-, and 3rd-place rankings are listed to the left of each item below. Items marked * were ranked highest by the 23 deans/directors or their designees (shown in red in the original form).

  10, 3, 2  * WASC IL assessment, encompassing sharing and developing best practices, intersections between the core competencies of IL and critical thinking, and signature IL assignments (IL assessment at campus level)
  3, 4, 6   * Digital learning objects with built-in assessments, pre-/post- tests, shared rubrics (IL assessment at Library level)
  0, 1, 3   System-wide pricing and administration of user surveys
  3, 2, 4   New ULMS and best practices for interpreting data
  0, 1, 5   How to communicate assessment data (dashboards, etc.)
  1, 2, 1   User experience assessment methods
  1, 0, 1   Website usability testing
  5, 10, 1  * Studies of how library use impacts student success (e.g., retention and GPA)

Who do you consider to be your Library's key "assessment contact"?


More than one contact? Please specify: