
Library Assessment Conference 2016

“If you are not at the table, you might be on the menu.”

Plenary Session 2 – Keynote III: Brian Nosek, University of Virginia, Promoting an Open Research Culture http://projectimplicit.net/nosek/

Space:

Reading Library Spaces: Using Mobile Assessment to Complete Your Library’s Story by Tobi Hines, Cornell University, Camille Andrews, Cornell University and Sara Wright, Cornell University

  • SUMA – most useful for asking a specific question
  • Improvements — optimizing the screen real estate, adding a multiplier button, and a managed list of the most popular configurations

Evidence-Based Decision Making Using New Library Data, by Heather Scalf, University of Texas Arlington

  • Sampled 4 times a day for 3 weeks – found over 400 students studying at 2 am (we generally have only 40). Learned who is in the library by using swipe cards in and out – used that, for example, to justify keeping the coffee shop open past 10 pm once they found it was mainly engineering students. Used EZproxy logs to determine the habits of online students (a minimal sketch of that kind of log analysis follows below).
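Not from the talk — just a minimal sketch of how EZproxy logs could be mined for a question like this. The log layout and field positions below are assumptions (EZproxy's LogFormat is configurable), so adjust for your own logs.

```python
# Minimal sketch (not the presenter's code): count distinct EZproxy users by hour.
# Assumes a default-style log line where field 3 is the username and field 4 the
# timestamp, e.g.
#   203.0.113.9 - jsmith [14/Oct/2016:02:13:45 -0500] "GET http://example.com/ HTTP/1.1" 200 5120
from collections import defaultdict
from datetime import datetime

def users_by_hour(log_path):
    hourly = defaultdict(set)              # hour of day (0-23) -> set of usernames
    with open(log_path, encoding="utf-8", errors="replace") as fh:
        for line in fh:
            parts = line.split()
            if len(parts) < 4:
                continue                   # skip malformed lines
            user = parts[2]
            if user == "-":
                continue                   # unauthenticated request
            stamp = parts[3].lstrip("[")   # 14/Oct/2016:02:13:45
            when = datetime.strptime(stamp, "%d/%b/%Y:%H:%M:%S")
            hourly[when.hour].add(user)
    return {hour: len(users) for hour, users in sorted(hourly.items())}

if __name__ == "__main__":
    for hour, count in users_by_hour("ezproxy.log").items():
        print(f"{hour:02d}:00  {count} distinct users")
```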

Driving the BUS: A Multimodal Building Use Study and Needs Assessment  by Mandy Shannon, Wright State University  

  • Building use study (current use patterns, constrained needs) and needs assessment (prospective, unconstrained needs) – two different things, apples and oranges. Study schedule – semester, week, day: 2 days a week, 6 times a day, 6 weeks a semester.
  • Data: gate counts; SUMA (zone-based analysis – whole floors in some cases and zones on other floors according to noise levels, also capturing tech use, furniture, and group size); white boards for voting and explaining why; questionnaires on random tables; photographs taken by the people doing the use counts to capture things that can’t easily come up in other data collection methods. (All done in the spring term.) The needs assessment survey ran in the fall with the office of institutional research. Analyzed data = variety, diverse, it depends. Students still value quiet.
  • Tip – build in planning time! Work with a non-library entity for perspective.

Don’t Dismiss Directional: Analyzing Reference Desk Interactions to Develop an Evidence-Based Content Strategy for a Digital Wayfinding System Christine Tobias, Michigan State University 

  • Developed their content via reference desk transactions — looking at directional questions (like we are doing with concierge data!)
  • MSU Digital Signage Working Group: allowed library to centrally manage signage, created formal guidelines for future digital signage installations (this was in collaboration with the whole university). This group included UX librarians/staff
  • Content strategy: signs had to include school branding, campus maps, and emergency information in addition to wayfinding; organized directional questions into broad categories (e.g., events/exhibits, photocopiers, etc.). Renovated floor maps, directory, event scroll, and weather all fit on one screen.

Shh Stats: Mining the Library’s Chat Transcripts to Identify Patterns in Noise Complaints Jason Vance, Middle Tennessee State University

After the Ribbon Cutting: Creating and Executing an Efficient Assessment Plan for a Large-Scale Learning Space Project, Krystal Wyatt-Baxter and Michele Ostrow, University of Texas at Austin

  • Repurposed staff space into media lab, active learning classrooms, and writing center.
  • Tips – use mixed methods, show users the changes, start with loose policies and grow them, and make sure you are super granular in assigning who does what. Also – take workflow into account and schedule time-intensive methods for when you are less busy.

Lead Users: A Strategy for Predictive, Context-Sensitive Service and Space Design, Ameet Doshi, Georgia Institute of Technology, and Elliot Felix, brightspot strategy

  • “The future is already here but it’s just not evenly distributed” – William Gibson
  • Eric von Hippel – book – Democratizing Innovation – “lead users”
  • Everett M. Rogers – book – Diffusion of Innovations – “early adopters”
  • Methodology:
    • ID lead users (early adopters), the top 15% – asked around in the library for who falls in this group; also looked to advisory boards
    • Engage through interviews (first), workshops, shadowing (quietly, unobtrusively), and journaling (for a week) – then they synthesized the data with post-it notes to get insights.
    • ID workarounds and innovations — looked for key moments: discovery times, times when they were growing, creation times, showcasing moments
    • Compare their behaviors to how other users are trending – what could they learn from those people?
    • Create concepts based on lead users’ ideas and new ones to meet their needs. (Gave some examples.)
    • The methodology will only get you so far; you have to cultivate empathy with your users so they will open up, share, and be willing to let you shadow them.

From Data to Development: Using Qualitative Data to Create New Ideas and Solutions, Ingela Wahlgren and Åsa Forsberg, Lund University

  • 2nd largest university in Sweden. A 62-foot reference desk!!
  • Touchstone tours – users gave librarians a tour of the key touchpoints they used when visiting the library (30 min)
  • Cognitive mapping – had users draw their vision of the library, changing pen color every 2 minutes (6 min total) so you can see what they considered important (drew first), then interviewed them for 10 min afterwards. (Ex: one user first drew the cafe, then commented on the so-so exhibit, then drew the restrooms, then the spot where he sits, then not much else except he lastly wrote “I do not know what happens in here” – ha!)
  • Then spent 2 days in a conference room to analyze the data and do affinity mapping and analysis. Example change – moved their digital sign to near the restrooms, where people queuing would see and read it!
  • Sorted ideas using the How-Now-Wow matrix http://gamestorming.com/games-for-decision-making/how-now-wow-matrix/
  • Insights: you can gain many insights from just a few people, and asking people in person gives a much higher response rate — they went into the library and just asked people instead of sending an email to a ton of people

Space: Describing and Assessing Library and Other Learning Spaces, Bob Fox, University of Louisville; Steve Hiller, University of Washington; Martha Kyrillidou, Quality Metrics, LLC; Joan Lippincott, CNI

Assessing to Transform an Aging Learning Commons: Leveraging Multiple Methods to Create a Holistic Picture of Student Needs, Jessica Adamick and Sarah Hutton, University of Massachusetts Amherst

  • Desk stats (LibAnalytics) – from 3 different service points in one area; used Tableau to visualize. Headcounts – every hour for 6 weeks a year. (A sketch of one way to combine these counts into Tableau-ready data follows after this list.)
  • Microclimate spaces analysis – classroom, presentation-style practice, and group study space (what worked and what didn’t; a place to test before developing long term)
  • Focus groups – had the school of management do it; they did 10!
  • Ethnographic research – used an anthropologist on campus; her students did the work as part of their syllabus, using many different methods
  • Workflow studies too
  • Findings and Recommendations!
    • Wayfinding and communication problems identified.
    • Consolidated reference, ILL, and circulation
    • Individual spaces in open area (semi-enclosed seating).  Students like microclimates – smaller areas within the larger area.
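Not part of the UMass Amherst talk — just a minimal sketch of stacking counts from the three service points into one long-format CSV that Tableau can read directly; the file names and columns are hypothetical.

```python
# Combine hourly headcount exports from several service points into one
# tidy long-format CSV for Tableau. Assumed input row format: date,hour,count
import csv

SERVICE_POINTS = ["reference.csv", "circulation.csv", "ill.csv"]   # hypothetical exports

def combine(paths, out_path="headcounts_long.csv"):
    with open(out_path, "w", newline="") as out:
        writer = csv.writer(out)
        writer.writerow(["service_point", "date", "hour", "count"])
        for path in paths:
            label = path.rsplit(".", 1)[0]            # file name doubles as the label
            with open(path, newline="") as fh:
                for row in csv.DictReader(fh):
                    writer.writerow([label, row["date"], row["hour"], row["count"]])

if __name__ == "__main__":
    combine(SERVICE_POINTS)
```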

Consulting Detectives: How One Library Deduced the Effectiveness of Its Consultation Area & Services, Camille Andrews and Tobi Hines, Cornell University

  • Context/Goals
    • Mixed methods  assessment for consultation area
    • Multiple help desks in 2014 with different hours, which was confusing
    • Declining reference traffic
    • Wanted to improve the visibility of the internal and external consult services
  • Methods:
    • Completed lit review, site visits and environmental scan
    • Focus groups
    • Interviews with staff
    • Prototype spaces and surveys
    • Space observations
    • Reference transaction analysis
    • LED TO: people don’t mind separate desks, but they need to know where to get what ⇒ wayfinding! A highly visible first point of contact, but a quieter area for longer reference consultations. Also – improve signage, merge poster printing and circulation, and make room for in-depth consult services.
  • RESULTS:
    • New, eye-catching signage that listed each service available.
    • Everyone who works at the combined circ/printing desk knows how to do everything, and student coverage has improved.
    • Reconfigured the space using existing furniture and tech for consult space; it is available for group study after 6 pm. Used SUMA to get info on the size of groups using it after 6 pm.
    • Added a (paper?) survey that students could complete
    • Staff survey to ask what worked and what could be improved.
    • More visual and acoustic privacy wanted — will be purchasing furniture that will work for this.
    • Will add digital signage at the entrance and will have tablets on stands at various workstations
    • NOTE: Important to know what other institutions are doing BUT make sure you listen to YOUR users.

Public Workstation Use: Visualizing Occupancy Rates Jeremy Buhler, University of British Columbia

  • Assessment as map-making – showing a rough sketch of how/where people are and what they’re doing/using.
  • Multi-campus study, but they hadn’t researched basic questions about numbers, distribution, and placement.
  • Simple data – how many logins/logouts – but that doesn’t tell you about occupancy. Excel can be used to derive occupancy rates from the login/logout timestamps, which gives a much richer dataset – same pattern, but the bars show concurrent occupancy rather than the number of logins/logouts, so you can see when workstations are at full occupancy. (See the sketch after this list.)
  • Studied occupancy rates/patterns at specific locations – realized they were under-occupied in some spaces (people don’t know they’re there), at 100% occupancy in others (that are obvious).
  • Set up for anything that has target-based timestamps, so that they could set goals for workstation use. (See the slide for the URL of the Tableau data visualization.)
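A rough sketch of that login/logout-to-occupancy step done outside Excel (not the presenter's method as given; sessions.csv and its column names are hypothetical):

```python
# Turn per-session login/logout timestamps into peak concurrent occupancy.
# sessions.csv is a hypothetical export with one row per workstation session:
#   login,logout
#   2016-10-14 09:02,2016-10-14 10:45
import csv
from datetime import datetime

def peak_occupancy(csv_path, fmt="%Y-%m-%d %H:%M"):
    events = []                                       # (timestamp, +1 login / -1 logout)
    with open(csv_path, newline="") as fh:
        for row in csv.DictReader(fh):
            events.append((datetime.strptime(row["login"], fmt), 1))
            events.append((datetime.strptime(row["logout"], fmt), -1))
    # Sweep in time order; logouts sort before logins at the same instant so
    # back-to-back sessions on one machine don't double-count.
    events.sort(key=lambda e: (e[0], e[1]))
    current = peak = 0
    for _, delta in events:
        current += delta
        peak = max(peak, current)
    return peak

if __name__ == "__main__":
    print("Peak concurrent workstations in use:", peak_occupancy("sessions.csv"))
```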

Library Snapshot Day, or the 5 Ws—Who, What, When, Where, and Why Are Students Using Academic Library Space: A Method for Library User Experience Assessment, Gricel Dominguez, Genevieve Diamond, Enrique Caboverde, and Denisse Solis, Florida International University

  • One-day library snapshot day!
    • 1 day, 3 hours, observational
    • 6 public floors, 9 zones
    • Teams of 2-4 per zone (a single researcher in 2 of the zones)
    • 34 factors in 3 categories under observation
    • IRB
    • A little ethnographic work too, tweet campaign across campus including the provost, marketing
  • Purpose – to ID user behavior and needs and find areas for improvement and promote library as a place
  • Methods – seating sweeps and observation
  • What they did: ID’d zones/locations to observe, came up with factors, chose a time in the term, created an intense checklist, scheduled it with everyone, did a practice run-through, and included staff too!

 

Organizational Issues/Other 

Assessment by Design: A Design Thinking Project at the University of Washington Libraries, Linda Garcia, Washington State University Vancouver (linda.garcia@wsu.edu), and Jackie Belanger, University of Washington http://libraryassessment.org/bm~doc/belanger-handout.pdf

Space Assessment via Tableau – U of Washington Libraries https://visualibrarian.wordpress.com/2016/10/07/library-space-assessment-in-tableau-a-step-by-step-guide-to-custom-polygon-maps-and-dashboard-actions/

Used Tableau to visualize the data https://public.tableau.com/profile/libraryviz#!/vizhome/REACH_1/REACHDashboard

Using Peers to Shed Light on Service Hours for Librarians
Hector Escobar (@greenghopper) and Heidi Gauder, University of Dayton

  • Staffs a ref desk about 40 hours a week
  • Surveyed peer institutions about their reference staffing on the desk
    • What does your reference staffing model look like now?
    • What are the roles of your librarians?
    • Have there been shifts?
  • Combined service desk was most common, then traditional service desk, then other, then reference consultations only (smallest)
  • Results:
    • Mixed bag of who staffed – (see slide)
    • Has the number of public service hours for your professional librarians declined? 60% yes, 40% no
    • 12 of the 17 will change or have changed service approaches by decreasing service hours.
    • Decreases were because of liaison activities, fewer consultations, and more chat reference
    • How can you be equitable? A workload policy. (Fairness is important.)
    • http://www.slideshare.net/HectorEscobar20/using-peers-to-shed-light-on-service-hours

Active Learning with Assessment
Katharine Hall and Meredith Giffin, Concordia University

  • Developed a two-part staff workshop with an active learning exercise to learn about assessment
  • Workshops also met strategic plan initiative to increase skills/share expertise
  • The same people who did not like it before still did not like it afterwards!
  • Scenario breakout groups were a good chance to work with others and get different perspectives on assessment
  • Active learning component of the workshop was successful

 

A Comparison Study of the Perceptions, Expectations, and Behaviors of Library Employers on Job Negotiations as Hiring Employers and as Job Seekers
Leo Lo, Old Dominion University, and Jason Reed, Purdue University

  • 74% of our profession have negotiated a job offer in the past. This is low compared to the national average, which is 82%.
  • Why don’t some people negotiate? Afraid to jeopardize the job offer.
  • What you should know:
    • Employers expect candidates to negotiate.
    • Only 71% have ever withdrawn an offer. When they did, it was because the salary demanded was unreasonable, issues arose during the background check, the candidate did not accept one or more elements of the offer, or they suspected the candidate was holding out for another position.
    • Didn’t find a significant gender difference in negotiation, but older respondents were less likely to negotiate
    • People who had more jobs tended to negotiate MORE
  • How much flexibility is there for salary? It seems there is more flexibility the more senior the position.
  • Human psychology at work here – as employers we believe there is flexibility, but the same people as job seekers do not see that there is flexibility!
  • Australians (this was presented at an Australian Library Association conference) believe Americans are “more proactive” and more likely to negotiate. So cultural norms or assumptions about cultural norms may affect behavior.  
  • Questions for these researchers to ask later: Did you distinguish between staff-level and faculty-level (tenure-specific) librarian positions? Did you take into account whether job seekers were “currently employed” at the time of their search?
  • READ: “You’re hired! An analysis of the perceptions and behaviors of library job candidates on job offer negotiation”   The Southeastern Librarian 64(2), 2-13.

Impact of Academic Libraries on Undergraduate Degree Completion http://crl.acrl.org/content/early/2016/09/27/crl16-968.full.pdf