
Hennen's American Public Library Ratings
The ratings are based on federal data but only the author is responsible for ratings on this site.
haplr-index.com  6014 Spring Street, Racine, WI  53406   USA

Share knowledge, seek wisdom. 



 

2010 HAPLR Edition  

Introduction

It is appropriate to start a new decade with changes to HAPLR, though not, of course, with every change that has been urged.  This edition of the HAPLR ratings is based on data published by the Institute of Museum and Library Services (IMLS) in 2009.  The data cover reports filed by libraries in 2008, primarily on 2007 activities. 

There has been a change in the IMLS reporting cycle that speeds up publication of the data.  HAPLR will probably have a “Round Two” late in 2010, when the data filed in 2009 become available.  In evaluating HAPLR after ten years, I have considered modifying the measures considerably. For this edition I will stay with the “Classic” HAPLR elements, but for the next edition I will modify the factors considerably.  It is time to do so.  

This is the 10th edition of the HAPLR ratings.  Ten libraries made it into all 10 editions!  They are:

  • Bridgeport Public Library WV

  • Carmel Clay Public Library IN

  • Columbus Metropolitan Library OH

  • Denver Public Library CO

  • Hennepin County Library MN

  • Naperville Public Library IL

  • Saint Charles City-County Library District MO

  • Santa Clara County Library CA

  • Twinsburg Public Library OH

  • Washington-Centerville Public Library

The LJ Index authors note "As a measure of library output, the LJ Index does not purport to assess library quality, excellence, or value." 

Saying that a library index is a measure of library outputs is a bit like saying that the weather reports are only about temperature, wind speed, and relative humidity.  When the weather reporter gives 5 stars to a day, we expect it to be nice out!  When rating systems place a library highly the hope is that library users will find it an exceptional place to visit.  Any other interpretation is simply disingenuous.

I could say that HAPLR is just a measure of library inputs and outputs.  But why do the ratings at all except to highlight what I hope are exceptional libraries?      

The HAPLR system does not simply develop scores for libraries.  It offers a variety of reports that compare a library's performance to that of comparably sized libraries in its state and in the nation. Over the years, thousands of libraries have used standard or specialized reports to evaluate current operations and chart future courses of action. I am pleased that many libraries have reported that they improved their funding and service profiles with these reports.
 

Historical Measures

Altogether, 312 libraries have made it into a Top Ten listing over the ten editions.  Fifteen entered the Top Ten for the first time this year.  They are listed below.

Popul. Category  Library                                   State
b) 250K          LOUDOUN COUNTY PUBLIC LIBRARY             VA
c) 100K          EVANSVILLE-VANDERBURGH PUBLIC LIBRARY     IN
c) 100K          WAYNE COUNTY PUBLIC LIBRARY               OH
d) 50K           CHAMPAIGN PUBLIC LIBRARY                  IL
d) 50K           WESTERVILLE PUBLIC LIBRARY                OH
e) 25K           ALGONQUIN AREA PUBLIC LIBRARY DISTRICT    IL
f) 10K           ELK GROVE VILLAGE PUBLIC LIBRARY          IL
g) 5K            BURTON PUBLIC LIBRARY                     OH
g) 5K            CANAL FULTON PUBLIC LIBRARY               OH
h) 2.5K          JOHN A STAHL LIBRARY                      NE
h) 2.5K          ORANGE BEACH PUBLIC LIBRARY               AL
i) 1K            BERESFORD PUBLIC LIBRARY                  SD
i) 1K            GRAND MARAIS PUBLIC LIBRARY               MN
i) 1K            ROCK CREEK PUBLIC LIBRARY                 OH
j) 0K            EAGLE PUBLIC LIBRARY                      AK

What has changed in this edition? 

  1. In this edition, I have stopped using imputed data.  What are imputed data, and why does dropping them matter?  The Institute of Museum and Library Services (IMLS) imputes data for individual libraries when the libraries themselves fail to report data elements.  The imputation is based, among other things, on a library's own past reports or on the average for libraries within its population category.  Eliminating libraries that have imputed data for any measure in the HAPLR rating system means eliminating 1,284 libraries from the IMLS dataset. 
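The exclusion step described above can be sketched in a few lines.  The file layout, library names, and flag code below are assumptions for illustration only; the actual IMLS files use different flag conventions that vary by survey year.

```python
import csv
import io

# Hypothetical extract: each measure column has a companion flag column.
# Here a flag value of "I" marks an imputed entry; real IMLS flag codes differ.
DATA = """libname,visits,f_visits,circ,f_circ
ALPHA PL,12000,R,30000,R
BETA PL,8000,I,15000,R
GAMMA PL,5000,R,9000,I
"""

MEASURES = ["visits", "circ"]

def drop_imputed(rows, measures):
    """Keep only libraries with no imputed value on any rated measure."""
    return [r for r in rows
            if all(r[f"f_{m}"] != "I" for m in measures)]

rows = list(csv.DictReader(io.StringIO(DATA)))
kept = drop_imputed(rows, MEASURES)
print([r["libname"] for r in kept])  # only ALPHA PL survives
```

A library is dropped if any single rated measure was imputed, which is why the rule removes so many libraries at once.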

What has not changed?

  1. Electronic Measures.  I have still not included electronic measures, as so many have urged me to do for so long.  I watched what happened when the LJ Index used electronic measures: San Diego County got a five star ranking on the strength of clearly erroneous data.  Under the HAPLR methodology, a single erroneous data point does not swamp the score as it does with the LJ Index, so for the next edition HAPLR will begin using the data on users of public internet. 

  2. Inputs and Outputs.  HAPLR still uses both input and output measures. Some suggest that mixing inputs and outputs confuses what HAPLR measures, but inputs like staffing levels and total spending are critical components of a library's operations.  Their inclusion greatly enhances the utility of the library reports that HAPLR provides. 

What will probably change in future editions          

Dropping some measures from the rankings is not the same thing as dropping them from the HAPLR Reports.  In fact, I intend to expand the measures used in the HAPLR Reports.  HAPLR has never been simply about the ratings.  From the beginning, libraries have been able to order standard or special reports. Thousands of libraries have used standard and specialized reports to compare their operations with other libraries around the country on a consistent basis.  These reports are used for planning purposes in many libraries every year.  

  1. Weighting.  I will most likely eliminate weighting of the factors.  Critiques have caused me to look at the factor weightings more closely.  I weighted the factors originally because it seemed to me that some factors are more important than others in considering library performance.   I still believe that to be true, but further examination of the data has shown that the weighting has very little impact on the ratings for libraries.

  2. Percent Budget for Materials.  The percent of total budget devoted to materials will probably be dropped from the rating criteria but not from the individual library reports. 

  3. Periodicals.  Over ten years ago, when HAPLR was initiated, periodicals were more important to a library’s service than they are today.  Students used to take out back issues for homework and adults did so to catch up on magazines to which they did not subscribe.  The Internet has changed most of that.  Today, for the most part, only current magazines are perused in the library.  Use of back issues has gone to online sources.

  4. Volumes per capita.  When HAPLR started, print volumes still made up the large majority of a library’s collection.   But as libraries diversify their collections into audio, video, and downloadable content, volumes per capita has become a less useful item.  I may expand the item to include other physical media like video and audio. 

  5. Collection turnover.  I will either change collection turnover to include other types of materials or eliminate it altogether. 

  6. Circulation per hour & Visits per hour.  These two measures will probably be dropped as part of the re-design of HAPLR.  The hourly counts will remain in the reports because they are useful items for planners to consider. 

  7. Public Internet Users.  I hope that by the next edition the IMLS data will be sufficiently cleaned up to allow for use of this relatively new measure.  It has been needed for a long time.  I hope that the data will be cleared of the trouble that resulted in the LJ Index giving Five Stars to a library with clearly errant data. 

  8. Public Internet Terminals.  In an effort to continue to measure inputs as well as outputs in the HAPLR ratings, I hope to add the number of Public Internet Terminals as a measure.  The number of terminals available to the public has clearly become a major factor in judging a library.  It will be a useful counterpart to keeping track of Public Internet Users. 

  9. Attendance.  I would very much like to be able to use the annual attendance data now being included in the IMLS dataset.  It is still a relatively new measure and I have been concerned about the skewed nature of some of the reports, but I hope it will be usable by the next edition.

  10. Spending on Electronic resources.  

  11. 0K population.  The smallest population category, under 1,000 population, has always been a bit of a problem.  The per capita measures frequently become very high.  This happens often when a community is a tourist destination with a small legal population but a much higher seasonal population.  In the end, the category will probably be retained because I hate to eliminate reports for that many small libraries. 
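The weighting question in item 1 can be illustrated with a small sketch.  The libraries, measures, values, and weights below are made up for illustration and are not the actual HAPLR factors or weights; the point is that weighted and unweighted percentile-rank scores tend to produce the same ordering.

```python
# Illustrative weighted vs. unweighted percentile-rank scoring.
# Library names, per capita values, and weights are hypothetical.
libraries = {
    "ALPHA PL": {"visits_pc": 7.2, "circ_pc": 11.0, "spend_pc": 42.0},
    "BETA PL":  {"visits_pc": 4.1, "circ_pc": 14.5, "spend_pc": 30.0},
    "GAMMA PL": {"visits_pc": 6.0, "circ_pc":  9.8, "spend_pc": 55.0},
}
weights = {"visits_pc": 3, "circ_pc": 2, "spend_pc": 1}

def percentile_ranks(libs, measure):
    """Percent of libraries at or below each library's value on one measure."""
    values = [v[measure] for v in libs.values()]
    n = len(values)
    return {name: 100.0 * sum(x <= v[measure] for x in values) / n
            for name, v in libs.items()}

def score(libs, wts):
    """Weighted average of per-measure percentile ranks."""
    ranks = {m: percentile_ranks(libs, m) for m in wts}
    total = sum(wts.values())
    return {name: sum(wts[m] * ranks[m][name] for m in wts) / total
            for name in libs}

weighted = score(libraries, weights)
unweighted = score(libraries, {m: 1 for m in weights})
for name in libraries:
    print(name, round(weighted[name], 1), round(unweighted[name], 1))
```

In this toy example the individual scores shift, but the ranking of the three libraries comes out the same with or without the weights, which is the pattern observed in the real data.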

     

Last revised April 2010

 

 

 



© haplr-index.com
Webmaster: thennen@haplr-index.com