Thursday, June 17, 2010
More links
Eric Hellman asks if public libraries are in a death spiral. He reflects on his experience in industry when a major contraction took place, suggests that cutting hours is counter-productive, and advocates more NPR-style fundraising. I respect Eric and his writing, but it is clear to me that he has not had to manage in the public sector. Much of the public does not believe the bad news about budgets until it hits them. Been there, done that. He includes a list of links to articles about public libraries being in trouble. I have talked about some of my experiences in July 2008 (twice), August 2008, and even earlier embedded in a post on customer service.
I am no longer sure where I picked up this citation, but it has good advice for bloggers, Bloggers: 7 questions to ask before hitting "Publish".
There was also a thoughtful post about copyright by Laura Crossett with both some good information, and interesting insights and reflections.
One of my electronic friends posted a link to this article which simply demonstrates the wrongness of the Arizona bill and other efforts to target immigrants, both legal (like this kid) and others. After all, only a very, very small number of my friends, colleagues, and acquaintances are not immigrants themselves or descended from immigrants. Certainly somewhere back there (in the 1800s) all my ancestors came from another country! This attitude scares the crap out of me!
Facebook privacy settings take another beating in this blog post from John Henry Clippinger. (Is that a pseudonym?) David Lee King also posted about the settings, with a screen shot and some cogent observations.
There is a great post for anyone thinking about freelancing. (It is a thought I entertain from time to time...) It is a good mix of philosophical and practical. [Note to self: see if there is a part two and/or three!]
And finally, I noted the issue between the California Digital Library and Nature Publishing Group. Steve Lawson was first on my radar with "UC to Nature Publishing Group: DROP DEAD." I then picked up on the story in the Chronicle of Higher Education. Three posts summarized the issue well for me, starting with Dorothea Salo's, and including both Eric Hellman's and Steve Lawson's. Eric's post includes links to the actual documents. And here is the Library Journal summary of the dust-up.
Wednesday, August 05, 2009
Links from around the web
- Here is a thoughtful post from Helene Blowers about books and print; it has a catchy title, too: Future of the book is not a "container question"
- An interesting article from PW about the Google Book Search Settlement and a panel discussion at New York Public Library
- When you "buy" a download, can you keep it forever? Some think not. Here is an article from boingboing on the topic.
- Streaming video is displacing DVDs. But I wonder: where does that leave the majority in Louisiana who do not have Internet at home?
- He also has some cogent thoughts on Bing and Yahoo. (As he suggests, I have started using Bing.)
- I picked this up off PUBLIB, where the poster noted that this conservative paper generally is against any taxes and increases in public spending, but does support public libraries. It is an interesting article.
- Archiving the transition of power in Alaska. This shows the important role we libraries have in conserving today's information for tomorrow's researchers.
- I am a little disappointed that I did not make the Top 100 list, and I have some quibbles, like librarian.net not being in the top 5! But there are some other obviously good choices. There were even a few I had not followed/found.
Friday, June 05, 2009
Links and miscellany
And now in categories!
Broadband:
Bringing Broadband to Rural America (the official FCC report)
New technology and Web 2.0:
Broadband Nation. A new blog about broadband issues.
Bringing in Broadband. The issues in one Florida county.
Mapping Broadband. This person/organization may well not be a friend for libraries.
Lobbying the FCC for access and no caps.
Paper Highlights Pros and Cons of Twittering at Academic Conferences
Intellectual Property issues (IP):
"librarians express affection through information"
Resolving the 80/20 dilemma "End users are spending less time on gathering the information they need – but their search failure rate is going up." A great article of importance to all librarians, but this one is focused on special/corporate libraries.
Technological accommodation of conflicts between freedom of expression and DRM: the first empirical assessment This links to a much longer PDF file on the Cambridge University web site
Search is too important to leave to one company – even Google Cory Doctorow in the Guardian
Study: Unselfish Individuals Benefit in Social Networks
9 simple suggestions for using social media
Twitter in the workplace. This is a presentation for government leaders on the use of Twitter.
IP rights and the Blind. The US, Canada, and the EU try to limit the rights of blind people to use technology to receive written material -- Cory Doctorow on Boing Boing
ALA:
IP: File sharing and Copyright. I have not read the full article (a link to the PDF is here), but the summary presents the intellectual property issues in file sharing in a new light. (Hmmm, maybe a full post is coming.)
Publishers are trying to avoid the Music industry's mistakes.
All Dressed Up with Nowhere to Go: A Survey of ALA Emerging Leaders
General Library stuff:
Mommy haven takes a hit in down economy
The Big Picture
How to love your library
The 'M' word always has good stuff about library marketing. Nancy Dowd does a good job, this one is on the future of the media we will need to deal with. {Memo to grammar caucus fans...I did that on purpose.}
Darien Library's new brand image was picked up from John Blyberg. Check out the other clients here.
Job seekers at the library. While this is not new, there are some interesting statistics at the end. I also have to comment that when I first looked at this site, I thought I was at NOLA.com which is the site for the New Orleans Times-Picayune.
Freemium A new way of thinking about library services and charging for them.
Google takes on Amazon from the New York Times...and it is only for e-books.
Personal:
Communicating a message. An interesting re-post from Stephen Abram on the differences that the wording of a message can make.
Free Range Librarian on where she is in her life and in her blogging life. It is actually a little similar to where I am.
Hot flashes -- a new perspective I found this one absolutely fascinating.
Want. Need I say more?
The rise and fall of LSU. I am not completely sure of the author's credentials, but it certainly is an interesting perspective on the positioning of state universities within the state power structure.
Wednesday, June 03, 2009
Silk Purses and Sows Ears? Assessing the Quality of Public Library Statistics and Making the Most of Them: PLA Spring Symposium 2009 - Morning I
Is it possible to develop a "library goodness scale"?
What is a good library, and what is a great library? This is an interesting challenge to define.
In a library organization, management's responsibilities are:
- defining goals;
- obtaining the needed resources;
- identifying programs and services to reach the goals;
- and using the resources wisely.
There are benefits and challenges: lots of performance measures -- most libraries have too many which are never used. (You have the authority to stop collecting data if it is not being used.)
A very important concept is "You get what you measure." He cited an example of police performance measurement. As a result of the measure used (minor quality-of-life issues), the community had many cops reporting pot holes – including the same pot holes day after day. The measure, reports filed, was incredibly high. The solving of crimes was not. As managers we need to refine the performance measurement system to reflect what we want.
Benefits and challenges: the role of evaluation is not to prove but to improve; it provides feedback on actual performance; it develops a culture of assessment. When data is disconfirming, the report is often ignored rather than the issue it raises being addressed.
Efficiency & Effectiveness
Efficiency is the internal perspective: are we doing things right? Effectiveness is the external perspective: are we doing the right things? It is an important distinction.
The Library-centered view: how much, how many, how economical, how prompt?
Types of measures:
Leading is something that lets you forecast demand: pre-registration figures. In Joe’s opinion there is no relationship between inputs and outputs in libraries!
Leading indicator at reference: very few libraries use the reference data they have to change the staffing pattern at the reference desk. There is no leading data for reference queries... it may be the number of Google searches that month. He quoted OCLC Perceptions data showing the library is used as the first source for reference 3% of the time. You can forecast from trends in past data. We should change staffing patterns; we should get rid of reference questions....
A leading indicator could be a "high holds list" for items on order; another could be the school district calendar for staffing the reference desk.
Question on interpreting data when users asked what they want. Triangulation, partly asking what they want, customer satisfaction data, focus groups.
Measures need to be: SMART: Specific (accurate), Measurable, Action oriented, Relevant (clear), Timely
It is also important to review the data, and how it is collected and reported. In one library, the gate count suddenly doubled. When a manager went to check, it turned out a new staff member was reporting it – the gate counted both those entering and exiting, and the former staff member had correctly reported half of the number as the attendance. The new staff member did not.
Why do we use the data? There are several reasons: to help understand demand; to demonstrate accountability; to help focus; to improve services; to move from opinions to use of data, more responsive to customer needs; communicate value.
When we collect data we make some assumptions. For instance, comparability (why does a 3-week book count the same as a 2-day DVD?) [Joe also argued against including renewals as part of circulation]; and accuracy [how do you count reference? He argued for using gate count as an indicator rather than tick marks, and for sampling -- we have demonstrated busy-ness, but we need to demonstrate value]. Blow up reference desks... get rid of them.
Performance reports are often a bunch of numbers with no historical context; show the last 2-3 years of data.
Problem is failure to keep pace with ever rising expectations.
Larry Nash White presented next on the Library Quality Assessment Environment
He noted that he was raised by grandfather who was an efficiency expert.
The performance person in the library often knows more about what is actually going on in the library than anyone else. Statistics and metrics are like tight-fitting clothes: they are suggestive, but not completely revealing.
History helps tell us where we have been. Most of what we measure we stole from somewhere else.
We have measured the parts; how do we measure the whole? In 1934 Rider developed a way to maximize efficiency using costs. "If we don't assess things and do it correctly, then others from outside of the library will come and do it for us." (Rider, 1934) About 100 library systems around the country are run by an outsourced firm (LSSI and others).
Google in 9 hours answers as many reference questions as all libraries in the
The first customer service survey was in 1939. The 50s and 60s saw the quantitative crunch. A smile ratio as a measure? Especially when there are more smiles on the other side of the counter.
What is happening today? What are the influencing factors?
How many have enough resources (money, time, staff)? No one. [Great story about...]
Increasing stakeholder involvement is important. When you want to keep your stakeholders out, that is a bad sign. They bring in own perceptions, biases, etc. which you must work with.
Technology is neutral; it is intent which gives it value. How we use it to deliver service makes it good or bad. How effective is our technology service? Total cost of ownership studies. Anti-tick-marks: use technology to count wherever possible. Use the automation system to count computer use, reference questions, and directional questions. An ILS is really good at counting, and can do it location by location and hour by hour.
We are always borrowing from someone else. Libraries are using what business world gave up years ago. And they are tools that were often designed for something else.
Time is affecting what we do.
More quantitative data is wanted by stakeholders; more qualitative data is wanted by the profession. This is a tension/division.
A wider scope is needed to assess and improve the process. Dynamic alignment: he held up a knotted string (a macramé) as an analogy for our performance assessment environment (not much give). Do you have the right things in place? Are you counting the right things and getting the right answers? (Pulled the right way, it became a single string.) When we align our assessment, we need to continually re-align because of changes in the environment.
Future predictions
- More assessment.
- More quantitative data to support quality outcomes
- More intangible assessment. (Many things we do are intangible, and they are important.) What would it look like if we started reporting the air?
- More assessment of organizational knowledge
- More assessment of staff knowledge (human capital): are we effectively assessing the use of that resource?
- Increased alignment of assessment process.
- [Intellectual capital. Human capital -- what people know. Structural value -- what is left when people go home. Value of the relationships: stakeholders, vendors, partners.] Report the value created. Wherever we spend money we need to report the value of what we do.
Ray Lyons then talked about Input-Process-Output Outcomes Models
IMLS has now embraced the
He showed several graphics, including a "Program Evaluation Feedback Loop." It is considered to be a rational process. It is also very static, and it ignores political issues.
If you remember why you are doing this, you can often come up with your own answers to your questions.
Evaluation questions include "merit." Orr's model does not include stakeholders very well; they are listed only as "Demand." How can you produce demand?
Performance assessment is often blind to unintended consequences, and it does not ask: what are the real needs of the community?
Input statistics should be used only in connection with outputs; they represent only the potential for services. Output statistics measure the current level of performance.
Goals are often related to the statistics. Aren't you going to reach a point where you can no longer improve?
Interpreting output statistics: interpretation in relation to goals is left up to the library. There are no standards for evaluating the quality or the value of the items. We also don't look at the relationships between the data elements. (Or we don't trust the judgments we make.)
PLA Spring Symposium Notes
They need some editing; the first will come up today. I am working on the others.
Saturday, April 04, 2009
Silk Purses and Sows Ears? Saturday Morning
LJ Index: Ray Lyons
Contingent valuation definition: the value that someone is willing to trade for something else -- what else will equal the item in question?
Looking at measures in general is a research project -- so let's just look at one.
- Library ratings are contests
- Comparison to library peers
- Performance is not gauged according to objective standards
They are based on standard library statistics. They do not measure: quality, excellence, goodness, greatness, value.
They do not account for mission, service responses, community demographics, or other factors.
Selection of statistics and weightings is arbitrary. The ratings assume higher statistics are always better, and adopt a one-size-fits-all approach (all libraries are rated using a similar formula).
Simplistic measures are created primarily for library advocacy. They are subject to misinterpretation by library community, the press, and the public.
Current rating systems: BIX (German Library Association), HAPLR, LJ Index
It is a totally arbitrary method. The more different the methods, the more different the views of the world.
It uses library expenditure levels as peer comparison groups. If you chose population instead, a similar distribution would exist.
It measures service output only. Libraries must "qualify" to be rated: population over 1,000; expenditures of more than $1K; meet the IMLS definition; and report all of those to IMLS.
Reference questions are statistically significantly different in their correlation to the other items. Look at the outlying values, most of which occur in the smallest libraries.
Indicators chosen: circulation per capita; visits per capita; program attendance; public Internet computer uses. If libraries do not report data, it cannot be retrospectively added. This is a contest, not a pure scientific event.
There are anomalies in the data; it reflects the "untidiness" of the IMLS data. They chose to use per capita statistics, which can be an unfair advantage or disadvantage depending on whether the official population accurately represents the service population.
Libraries are rated by how far above or below the average they are. Calculate the mean and the standard deviation; the resulting score shows how spread out the data is.
To create a standard score: compare visits to the mean and divide by the standard deviation. A score that depends on the others in your group is not a real scientific evaluation process, and it does not measure quality.
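That standard-score calculation can be sketched in a few lines. This is a minimal illustration, not the LJ Index's actual formula; the per-capita figures below are invented:

```python
from statistics import mean, stdev

# Hypothetical visits-per-capita figures for a peer group of libraries.
visits_per_capita = [4.2, 5.1, 3.8, 6.0, 4.9, 5.5, 3.2, 4.4]

mu = mean(visits_per_capita)
sigma = stdev(visits_per_capita)

def standard_score(value):
    """How many standard deviations a library sits above or below the group mean."""
    return (value - mu) / sigma

# A library at the group mean scores 0; one with 6.0 visits per capita
# scores well above the mean.
print(round(standard_score(6.0), 2))
```

Note that the score for any one library moves whenever the peer group changes, which is exactly the speaker's point: it is a relative ranking, not a measure of quality.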
What is the point, data is old ... advocacy is the reason to do it. We are in a profession where technology is driving change. Perhaps we really need to change.
What can you squeeze out of the data we have? Is this what we should do?
(Handout... an adjustment to get rid of negatives, then get rid of the decimal point.) The number looks very precise, but it is not.
Advocacy -- showcases more libraries, encourages conducting and publicizing local evaluation
Encourages submission of data, and emphasizes the limitations of the indicators.
The model is inherently biased: it measures service delivery. If other statistics were chosen, other libraries could move to the top. Comparison between groups is inherently impossible.
It encourages assessment and the collection of data not previously collected. (How many can you list?) This is a contest and not a rigorous evaluation. The number of libraries receiving five stars was arbitrary, partly determined by space in the journal.
Customer Satisfaction -- Joe Matthews
Customer satisfaction is performance minus expectations. It is a lagging, backward-looking factor; not an output or an outcome, but an artifact.
Survey: Can create own survey, borrow from others, use a commercial service (Counting Opinions), need to decide when to use.
You need to go beyond "how are we doing" and ask about particular services; ask respondents how they are doing. Open-ended questions elicit a high response rate. Most surveys focus on perceptions and rarely ask about expectations. (PAPE - Priority And Performance Evaluation)
Service quality: SERVPERF - Service Performance; SERVQUAL - Service Quality (see handout); LibQUAL+ for academic libraries.
LibQUAL+ is web based, costs $5K per cycle, and public libraries who have tried it have generally been disappointed.
Telephone surveys are being abandoned as more and more people are dropping land lines (partly to avoid surveys).
You cannot do inferential analysis if the response rate is less than 75%.
The single-question customer satisfaction survey for loyal customers: "How likely is it that you would recommend X to a friend or colleague?" on a 10-point scale. Net Promoter Score (NPS) (handout)
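The NPS arithmetic is simple: the percentage of promoters (ratings of 9-10) minus the percentage of detractors (0-6). A minimal sketch, with invented survey responses:

```python
def net_promoter_score(ratings):
    """NPS = % promoters (9-10) minus % detractors (0-6); passives (7-8) are ignored."""
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100 * (promoters - detractors) / len(ratings)

# Hypothetical responses to "How likely are you to recommend the library?"
responses = [10, 9, 8, 7, 10, 6, 5, 9, 10, 3]
print(net_promoter_score(responses))  # 5 promoters, 3 detractors out of 10 -> 20.0
```

The score can range from -100 (all detractors) to +100 (all promoters), which is why a single number carries so much signal about loyal customers.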
The library fails 40% of the time (although for a range of reasons). One of the worst things is to tell people that there is a long wait for best sellers.
Look at wayfinding in the library. Hours and cleanliness, as well as comfort and security, are very important. One library put flowers in both the men's and women's restrooms. Review when you say no.
Take a walk in your customer's shoes, but remember that you are also wearing rose colored glasses.
Hotel staff are trained in making eye contact and greeting.
Friday, April 03, 2009
Silk Purses and Sows Ears? Assessing the Quality of Public Library Statistics and Making the Most of Them: PLA Spring Symposium 2009 – Afternoon
Measuring excellence is difficult.
Output statistics are not useless, but they cannot tell us about quality, excellence, or if there is a match between community need and services.
Interesting examples of the use of data from the afternoon: one branch was looking at cutting hours. Circulation per hour was the same in the first hour and the last hour, but all the circulation in the last hour came from one person, while there were many people in the first. Therefore closing the last hour would inconvenience only one person. Another library rearranged its collection so items were easier to find; circulation went up, and reference questions went down (and patrons were more satisfied).
Cost-benefit analysis has several advantages: it quantifies the monetary value of library services. How do you choose the economic value? Techniques include consumer surplus valuation and contingent valuation (how much would you pay for...), or willingness to pay.
Select specific services delivered to specific audiences.
One advantage is that it is well known in the business community, and it can be used over time for specific services or products. The cost can be high, since it involves surveying the community. It is also possible that you may choose one area to study while missing other areas which the community values more than you think.
Larry White
Estimated cost of performance assessment in Florida in 2000 was $16 million, but the state only gave $32 million in state aid, and therefore 1/4 to 1/2 of state aid was wasted.
Outcomes based performance measurement takes a long time to generate results...years.
Return on investment. Everyone hopes a high return on investment.
ROI is used by business. Happy stories and smiley kids did not work, so he created an ROI figure: the first year it was 6 to 1. He took a buck from a commissioner and promised a return. "I created this" (showed the stats). He showed the ROI, and then told a story.
It was a combination of cost avoidance and returned revenue. He used data from the genealogy/history room sign-in list: he took the local visitors and multiplied by local tourism spending figures, then estimated overnight stays for distant customers. This showed the library accounted for $500,000 in tourism revenue. Then cost avoidance: the average cost per book times the number of circulations (because people did not have to purchase those books).
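A rough sketch of that kind of calculation follows. Apart from the $500,000 tourism figure cited in the talk, every number here is an invented placeholder, so treat it as an illustration of the arithmetic, not the speaker's actual figures:

```python
def library_roi(tourism_revenue, cost_avoidance, library_budget):
    """Return-on-investment ratio: dollars of value generated per dollar of budget."""
    return (tourism_revenue + cost_avoidance) / library_budget

# Cost avoidance: average purchase price of a book times annual circulation.
avg_book_price = 25.00          # assumed average retail price
annual_circulation = 200_000    # assumed circulation count
cost_avoidance = avg_book_price * annual_circulation   # 5,000,000

tourism_revenue = 500_000       # figure cited in the talk
library_budget = 900_000        # assumed annual budget

print(round(library_roi(tourism_revenue, cost_avoidance, library_budget), 1))
```

With these placeholder inputs the ratio lands in the neighborhood of the "6 to 1" figure mentioned above, which shows how sensitive the headline number is to the assumed book price and budget.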
Data mining is becoming an important role for libraries in the community. You need to use a combination of numbers and words in a creative fashion.
Can use it to justify what you want to do and to save what you want to do. It is scalable.
Lack of consensus in value and use. It is usually used defensively (preserving library funding) and reactively. We wait for disasters to tell the story.
Joe Matthews
Summer Reading Program
In Portland, OR, there were 75,000 in the program; they raised $60K and flew a family of 4 to Disneyland. Only 30% complete the program. The real outcome is improvement in reading level, but measuring it is costly. It can be done through a third-party agency to protect the confidentiality of the kids.
He encouraged us to start thinking about the outcome side of things: do they spend more time reading, do they do better in school? One place surveys caregivers about their perception of outcomes.
Statistics
Context, trends, history.
Age of collection as a stat.
Unintended consequences of performance measures
- assessment process consequences (survey fatigue)
- tactical process consequences
- strategic process consequences
Tactical consequences: operational (how you work); systems (can create an "us vs. them"; can look at the forest and forget the trees); financial (the ten-to-one return speech -- the library got loaned out to the economic development department); service ("new Coke"); organizational impacts (can bring good things).
Strategic consequences occur over a long period of time. One operation supported unethical behavior to meet the need to constantly increase circulation. The problem comes when assessment drives the mission rather than the mission driving the assessment.
Final of the day: Management Frameworks (Joe Matthews)
Three Rs: Resources, Reach, and Results.
Resources: how much money do we need to achieve our goals?
Reach: who do we want to reach, and where?
Results: what do we want, and why?
Choose only two or three measures. It is important to think about customer segments (beyond demographics); they come with different needs and different expectations.
Performance Prism (see handout), used in England, New Zealand, and Australia
Balanced Scorecard
Financial perspective: saving time, reducing costs, generating new revenues, share of the pie
(see hand out....)
Building a Library Balanced Scorecard: Mission and Vision Statements; Strategy -- how to reach the mission; select performance measures; set targets or goals; projects or initiatives
Strategic plans often do not use the word strategy. Most use one of two approaches: the conservative, reachable one, or the scientific wild-ass-guess approach; but you may want to have a BHAG (big hairy audacious goal).
The scorecard is usually created by a small group with the results shared with the stakeholders.
Silk Purses and Sows Ears? Assessing the Quality of Public Library Statistics and Making the Most of Them: PLA Spring Symposium 2009 - Morning II
We need to tell our stories better.
Every one of the 2,000 FedEx outlets reports several thousand data elements every day. The data is collected electronically, compiled company-wide, and delivered to upper management with comparatives, with a response lag of less than 12 hours. They took assessment and made it a value-added service.
Walmart has a new data server farm with multi-petabyte storage. All data is kept and stored for 2 years. When something is sold, a replacement item leaves the warehouse to replace it on the shelf.
More metrics need to be automated, and more frequently performed. If we don't figure out how to do it for ourselves, someone is going to come in and do it for us.
Ray Lyons
Challenges of Comparative Statistics
Choosing a comparable library - there is no answer.
We need to be as diligent as we can to get a satisfactory answer even if it is not as satisfactory as we want it to be.
It is also about accountability.
See book on municipal benchmarks in footnotes. (Organization is in favor of standards existing to see if you are doing ok.) If money is not attached to standards, there may not be a reason to adhere to standard.
Types of benchmarking: data comparison using peer organizations; analyzing practices for comparable purposes; and seeking the most effective, best practices.
Benchmarking steps: define the measures to use (and who will be involved); identify appropriate partners; agree on ground rules (ethical issues); identify objectives; document profile and context; collect data; analyze data (including normalizing, e.g. per capita); report results; and then use them for change.
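The normalizing step in the list above is the part that makes peer comparison meaningful, since raw counts just reward big libraries. A minimal sketch, with invented figures for three hypothetical peer libraries:

```python
# Hypothetical raw figures for three peer libraries.
libraries = {
    "Library A": {"circulation": 850_000, "population": 65_000},
    "Library B": {"circulation": 420_000, "population": 30_000},
    "Library C": {"circulation": 1_200_000, "population": 110_000},
}

# Normalize per capita so libraries of different sizes can be compared.
per_capita = {
    name: d["circulation"] / d["population"] for name, d in libraries.items()
}

# Ranked from highest to lowest circulation per capita.
for name, value in sorted(per_capita.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {value:.1f} circs per capita")
```

Note how the ranking flips: Library C circulates the most items in absolute terms, but per capita it comes out last, which is exactly the distortion normalization is meant to expose.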
It is important to think about community need and culture, and to choose libraries which have the same service responses.
You need to count what you want/need, but also report using the standard definitions so that the data can be compared. Another person noted that staff feel that what is counted is what is valued.
Peer measures: average individual income; college education; # women not working outside the home; school age children; geographic area
A study of what output measures Ohio library directors used found three used regularly: materials expenditures, operating expenditures, and circulation. [These are the easiest to find, and easy to define.]
They are now moving to measures of internet use, job-application sessions, and use of computers.
Recommendation: at a minimum, identify peer libraries by:
- service area population
- key demographic characteristics
- library budget
- service responses
Some libraries are in "fertile settings" which can explain statistical performance.
Joe Matthews
Activity Based Costing
Handout based process:
Figure the costs... salaries. The handout includes the cost of the library, which, as a municipal library, does not include utilities.
Friday, June 06, 2008
On the Road

Ever since I have been here, I have been talking about going "On the Road with Bob." Bob is the LEPMPL staff member who, five days a week, drives and collects the materials returned to the eight book drop locations around Eau Claire. [Update, 5/25/2014 - There are now 10 locations.] Take a look at the map, and you will soon realize what an incredible service this is to the community. Library staff empty the drops six days a week (Monday through Saturday). Monday through Friday, Bob does it. Bob is a retired library custodian who has been doing this now for about 4 and a half years. It is a great fit, he gets some part time work, and would normally be up at that hour. He is incredibly reliable, and committed to doing a great job.

Our first stop was at the supermarket right by my apartment. We then visited each of the book drops in a giant "anti-clockwise" circle around the City. (Look at the map, and you will see why I describe it that way.)
One advantage of riding shotgun was that I was actually able to sit and look at what we were passing. It is rare that I am a passenger in the town, and when driving, I try to pay more attention to the traffic than the passing scene.
I have a few final comments on my adventure today....
I still find it incredibly wild how many book drop locations we have. People in this community do not have any idea how unique that is. Second, even though we get a good volume of returns through these book drops, people still have to come downtown to actually get their items, so it has not really affected our circulation, but I think it has helped reduce our loss rate. Third, the fit between a person and a job is critical. For this job, Bob is a great choice. He has all the right qualities and enjoys it! That is very important.
Added challenge to my non-Eau Claire readers: Is there any other public library which has as many off-site places to return library materials? Remember, the eight locations in Eau Claire are at convenience stores and grocery stores, not at branches or even other government offices. I contend that Eau Claire is unique and has more off-site places to return materials than any other public library.
Tuesday, January 29, 2008
Snow and Cold
Today, for about the third time in this decade, I made the decision to close the Library early. It was an interesting process, but for those not paying close attention to the weather here, the City Director of Public Works included this in his email on snow clean-up operations:
When the Mall closed, it was easier to make the decision since that was the standard which my predecessor used.
URGENT - WINTER WEATHER MESSAGE
NATIONAL WEATHER SERVICE TWIN CITIES/CHANHASSEN MN
1053 AM CST TUE JAN 29 2008
IN COMBINATION WITH THE SNOW AND WINDS...BITTERLY
COLD AIR CONTINUES TO FLOOD INTO THE REGION. WIND
CHILLS OF 25 TO 40 BELOW ZERO ARE EXPECTED THIS
AFTERNOON WEST OF INTERSTATE 35 IN MINNESOTA. THESE
WIND CHILLS WILL SPREAD ACROSS THE ENTIRE
REGION TONIGHT AND LAST THROUGH WEDNESDAY MORNING.
THEREFORE A WIND CHILL ADVISORY IS IN EFFECT TONIGHT
AND WEDNESDAY MORNING.
A WIND CHILL ADVISORY MEANS THAT VERY COLD AIR AND
STRONG WINDS WILL COMBINE TO GENERATE LOW WIND CHILLS.
THIS WILL RESULT IN FROST BITE AND LEAD TO HYPOTHERMIA
IF PRECAUTIONS ARE NOT TAKEN. IF YOU MUST VENTURE
OUTDOORS...MAKE SURE YOU WEAR A HAT AND GLOVES.
Closing for weather is tough. Actually, any unexpected closing is always fraught with possible public relations faux pas.
I hate closing the library, but the safety of staff comes first.
Wednesday, August 22, 2007
The Open Door Director
One post caught my eye, and it is a LJ [that's Library Journal not Live Journal] column by the blogging Michaels (Casey and Stephens). It is called The Open Door Director.
They are so right when they say "It's no longer enough for the library director simply to keep the place running. Today's director is politician and lobbyist, fundraiser and spokesperson, juggling all of these titles while administering a library."
They cite the Jackson County (Oregon) libraries, which recently closed down, as one example of how public libraries cannot assume that funding will continue. (The last interim director, Ted Stark, arrives to start in nearby Menomonie at the beginning of next month.)
What they talk about is what I have always tried to do as a library director: be out in the community, and make the community feel like they can have a say in the library by making all parts of the community into "stakeholders." [Interestingly, my new library has a recent tradition of "Stakeholder Events" to emphasize that feeling.]
I'm still working on getting all aspects of Library 2.0 into my head and heart. But it is reassuring to read that I am doing some of the right things.
Saturday, August 18, 2007
The Long Tail
I did find parts of it over-long, but many of the examples were fascinating (to me) only because of my omnivorous taste for facts (aka trivia).
If I feel inspired, I may write more on this topic, but my personal life is disjointed enough that sitting and thinking clearly is difficult for me at the moment.
Monday, May 14, 2007
Irony: Libraries and Classics
"The Back Page" is by Booklist editor Bill Ott. His topic is irony, and how he has to work hard to determine irony.
I found the juxtaposition of the two items ironic.
Friday, March 02, 2007
ALA Council Candidates - PUBLIB
In a private email after a recent comment about folks on the list running for ALA Councilor-at-Large, ALA Executive Board member (and good friend) Nann Blaine Hilyard identified the following as current PUBLIB subscribers who are on the ballot for Councilor-at-Large.
They are:
- Catharine Cook (Chickasha, OK)
- Nann Blaine Hilyard
- Michael McGorty
- Dale McNeill
- Melora Ranney Norman
- Marti Goddard (a friend from a prior ALA life!) [added]
- Sue Kamm (whom I misunderstood was standing for election again -- even if I now remember signing her petition) [added, link added 3/4]
- James Casey (whom I misunderstood was standing for election again) [added]
If I missed someone, let me know. I will do a full list of my personal endorsements after I see the ballot.
[Added 2 more names at 12:15 pm CST]
{Added link to Sue Kamm's "I am a Councilholic" post on Sunday, 3/4}
Sunday, February 04, 2007
Famous for a Day
It is slightly scary to pick up the paper and see a color photo of yourself on the front page! On the other hand, it is good for the Library.