Before you run it at your institution, you must personalise it for your library to ensure the results are meaningful and the respondents are able to answer the questions.
You should insert the name of your library and director where indicated. The questionnaire refers to “the Library” throughout; if your library is known by a different name, replace “the Library” with that name (this is particularly important in converged services, where some may view the Library as only part of the service rather than the whole). You should go through each question and answer to make sure that the language used, especially the terminology, will be understood by your respondents to have the intended meaning.
Questions 1 and 2 require that you provide a list of answers that are appropriate to your situation. Question 1 is used to aggregate the responses by team. You should choose team names that staff members will identify with, and an appropriate level of granularity for the results to be useful to you.
Question 2 is used to aggregate responses by the level of the staff member within the organisational hierarchy. You should choose an answer list that will do this and be meaningful to your situation (the questionnaire used Grade, but you may use job title, or any other description).
If you have a multi-site service you may want to add a third question to the ‘About You’ section – asking where respondents work. You can then aggregate the results by location, if you want to.
The instrument should be administered using an online survey tool, such as Survey Monkey, Bristol Online Surveys, or an in-house application.
To provide as complete and accurate a picture as possible, you should administer the questionnaire to all members of staff at your library and aim for a 100% response rate.
Questionnaires of this nature receive the best response rate if run on a relatively short timescale (e.g. three weeks), though you will need to consider the timing to ensure no particular groups are unable to complete it.
Analysing the results
You must take care to ensure that the data provided by respondents is held anonymously and securely in accordance with data protection rules. This is your responsibility.
You must also take care to ensure that as few people as possible have access to the raw data: in most libraries it would be possible to identify who provided a particular response by combining the answers to the attribute questions.
The survey administrator should aggregate and analyse the data before reporting it to anyone.
The responses to each question on the instrument should be aggregated. The modal (i.e. most frequent) response is taken as the ‘result’. You should use the rubric to map answers onto the levels of the Quality Maturity Model.
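As an illustration, the aggregation step can be sketched in Python. The answer labels and rubric values below are invented for the example; they are not taken from the instrument, and your own rubric will define the actual mapping onto QMM levels.

```python
from collections import Counter

# Hypothetical responses to a single question (labels are illustrative).
responses = ["Agree", "Agree", "Neutral", "Agree", "Disagree", "Neutral"]

# The modal (most frequent) answer is taken as the 'result' for the question.
modal_answer, count = Counter(responses).most_common(1)[0]

# Illustrative rubric mapping answers onto QMM levels (assumed values).
rubric = {"Disagree": 1, "Neutral": 2, "Agree": 3}
qmm_level = rubric[modal_answer]

print(modal_answer, qmm_level)  # → Agree 3
```

The same loop would be run once per question, giving one QMM level per element of the model.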
Three of the elements of quality culture (8.1, 8.7, 8.8) do not have questions on the instrument. Instead, these are assessed by cross tabulating the answers to specific other questions by team membership and/or level within the hierarchy.
If the responses are spread over a number of answers, the results should be cross-tabulated against team membership and level in the hierarchy to see if this produces different responses between groups and the same responses within groups. If so, these differences should be reported. If no groupings can be determined, then the main modal responses should be reported.
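The cross-tabulation step above can be sketched as follows; the team names, grades, and answers are hypothetical, and a spreadsheet pivot table would do the same job.

```python
from collections import Counter, defaultdict

# Hypothetical raw rows: (team, grade, answer) for one question.
rows = [
    ("Lending", "Grade 4", "Agree"),
    ("Lending", "Grade 4", "Agree"),
    ("Lending", "Grade 6", "Neutral"),
    ("Systems", "Grade 4", "Disagree"),
    ("Systems", "Grade 6", "Disagree"),
]

# Cross-tabulate answers by team to look for between-group differences.
by_team = defaultdict(Counter)
for team, grade, answer in rows:
    by_team[team][answer] += 1

# Report the modal answer within each team; differing modal answers
# across teams suggest a grouping worth reporting.
for team, counts in sorted(by_team.items()):
    modal, _ = counts.most_common(1)[0]
    print(team, modal)
```

Grouping by the `grade` field instead would give the equivalent cross-tabulation by level in the hierarchy.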
Assessing your library's location on the QMM
The Quality Culture Assessment Instrument is a survey of 43 questions. All but one of the questions requires the respondent to select an answer from a multiple-choice list. All questions are mandatory. Most respondents find it takes around 15 minutes to complete the questionnaire.
Presenting the results
The results should be presented by locating the library on the Quality Maturity Model. This enables you to see both where you are on the road to a culture of quality and what the next stage forward looks like.
Download the Word version of the Quality Culture Assessment Instrument to personalise it.