ISO 9000 Registrar Customer Satisfaction Survey
Despite all the grumbling that has always surrounded the ISO 9000 registration process, clients are actually satisfied with the level of customer service provided by their registrars, according to Quality Digest's recent survey of ISO 9000-registered sites in the United States and Canada.
As a magazine that has covered ISO 9000 from the beginning, we have heard all the complaints and rumors about particular registration practices or registrars. We expected to discover a wide range of customer satisfaction levels--from highly dissatisfied to highly satisfied.
To our surprise, we discovered that the level of customer satisfaction is quite high. Registrars scored well above average in every category, and when clients were asked if they would recommend their registrar to others or choose that registrar again, the average survey response was a very respectable 4.19 on a five-point scale. Even those registrars that fell below the 4.19 mean performed well, with scores that ranged from 3.5 to 3.97.
To conduct the survey, Quality Digest first created a list of 53 items compiled from pre-survey interviews with more than 30 randomly selected clients. All items, except for two about overall satisfaction, came from the clients themselves.
Quality Digest then faxed questionnaires to more than 15,000 registered companies in the United States and Canada. We received more than 1,800 responses, a 12-percent response rate. All participants were asked to respond to the 53 statements related to their registration or auditor experiences. Respondents evaluated each statement on a five-point scale from "strongly disagree" to "strongly agree," with a sixth response for "doesn't apply." For a complete look at the survey methodology, see "Survey Methodology."
About the data
The following series of charts illustrates registrar performance in several customer satisfaction categories. A mean and a list of companies that performed significantly above the mean are given for each category. We are 95-percent confident that the actual average differs from the value shown on the charts by no more than 0.05. For about half of the categories, the interval is closer to 0.02 in either direction.
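Quality Digest doesn't spell out the computation behind that 0.05 figure, but a margin of that size is what a standard normal-approximation confidence interval produces at this sample size. The sketch below uses hypothetical numbers: with roughly 1,800 responses and a sample standard deviation near 1.08 on the five-point scale, the half-width comes out to about 0.05.

```python
import math

def mean_ci(responses, z=1.96):
    """Mean and 95-percent margin of error for a list of 1-5 scale
    responses, using the normal approximation (z = 1.96)."""
    n = len(responses)
    mean = sum(responses) / n
    # Sample variance (n - 1 in the denominator)
    var = sum((x - mean) ** 2 for x in responses) / (n - 1)
    margin = z * math.sqrt(var / n)
    return mean, margin

# Hypothetical check of the article's figure: 1,800 responses with a
# sample standard deviation of about 1.08 give a margin of roughly
# 1.96 * 1.08 / sqrt(1800), which is approximately 0.05.
```

The categories with margins nearer 0.02 would simply be those with more responses, a smaller spread of answers, or both.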
Our first impulse was to simply present all the registrars by name with the data collected for each. This seemed an easy choice because all registrars performed well; i.e., the mean was high and no registrar scored poorly. However, when we discussed the results with the registrars prior to publication, those that didn't perform as well as others were mortified, arguing that in a business as highly competitive as theirs, the slightest perception that they ranked below an industry mean would "kill" them. The head of one large registrar told us that his company has been working hard during the last six months to improve customer service and he feared that "heads would roll" at his office if there was any perception by upper management that the business wasn't meeting its goal of outstanding customer service.
After much discussion, we agreed that some readers might perceive registrars that fell below the mean as "bad" companies--a perception that, although far from the truth, could indeed damage those companies.
The compromise is to present the above-the-mean performers as benchmarks for the others. The data for those registrars below the mean is also shown, but without the company names. We believe that this data is necessary to provide a true sense of the point spread in customer satisfaction, which in most cases was small.
In addition to the eight to 10 registrars depicted above and below the mean for each category, about 25 registrars cluster around the mean. For a complete list of registrars for whom survey results were collected, see the "Registrars Surveyed" sidebar.
Correlating the data
One of the first things we hoped to discover was the correlation between individual survey categories and overall customer satisfaction. We define overall customer satisfaction as the willingness of clients to recommend or reselect their registrar. Survey items were divided into the following categories: administration, including responsiveness, scheduling and price; communication; helpfulness, including ease of contact and useful suggestions; industry knowledge; professionalism; and thoroughness.
Surprisingly, the highest correlation to customer satisfaction was in the administrative, communication and professionalism categories (see Table 1). The nuts and bolts of registration--industry knowledge and thoroughness--had less of a correlation. What this could mean is that ISO 9000 is still considered more a necessity than a choice. An "If I have to get this done, at least make it pleasant" attitude may predominate.
Looking at correlations by the 53 individual items rather than by categories, we see that fairness, responsiveness and professionalism are the best indicators of customer satisfaction (see Table 2).
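The article doesn't describe the statistic used, but correlation scores like these are most likely Pearson's r computed over paired client responses: each item's rating against the same client's reselect/recommend score. A minimal sketch, with hypothetical data:

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length lists,
    e.g. an item's ratings paired with reselect/recommend scores."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical: ratings for one item paired with each client's
# reselect/recommend score. A value near 1.0 means the item tracks
# overall satisfaction closely; near 0 means it barely matters.
item_ratings = [4, 5, 3, 4, 2, 5, 4]
reselect     = [4, 5, 3, 5, 2, 5, 4]
r = pearson(item_ratings, reselect)
```

On this reading, price's 0.196 means price ratings explain only about 4 percent (r squared) of the variation in willingness to reselect, which is why it barely registers.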
Despite the complaints we commonly hear about how expensive registration can be, price had little impact on a client's decision to recommend or reselect his or her registrar (a correlation score of 0.196).
We were initially concerned that the size of the client company or the size of the registrar might have some bearing on customer satisfaction. In fact, this wasn't a factor. Satisfaction levels seem to be consistent regardless of registrar size or client company's size.
The survey indicates that responsiveness, a firm handshake and patience with a smile may mean more to customer satisfaction than a thorough auditor who knows the client's industry inside out.
Neither the categories nor the individual items showed exceptional correlation scores. This could mean that the ultimate customer satisfaction questions weren't asked, but because the survey items were generated by the clients themselves, it is more likely that customer satisfaction in the registration field is too nebulous to be categorized.
Administration (Fig. 1)
This was the highest-ranking category in terms of correlation with overall satisfaction. Within this category, the highest-correlation item was responsiveness, followed by the timely return of phone calls and follow-up administration.
That administration had the highest impact isn't completely surprising. Administrative personnel are the first and last line of contact with the client. Pricing and scheduling start here, and billing and issuing a certificate end here. This survey indicates that a good administrative interface goes a long way toward heightening the perception of good customer service.
Communication (Fig. 2)
This category includes items such as sharing information, communicating with the client and with auditors, and ease of contact. With a mean of 3.99, this was the highest-scoring category in the survey. The high scores, coupled with the second-highest correlation to repurchase or recommend, make this category very important.
Because registration is such an expensive and complex process, subject to the scrutiny of managers and stockholders alike, communication is a priority. It's no coincidence that four of the five companies that scored above the mean in communication were also four of the five companies that scored highest in overall customer satisfaction: Good communication goes a long way toward relieving stress.
Professionalism (Fig. 3)
Overall, registrars did extremely well in this category. For nearly two-thirds of the items in this category, registrars were rated higher than 4. Important baseline characteristics like the auditors being "professional," having "good cross-functional knowledge" or being "technically qualified" ranked highly, as did less specific attributes like patience and fairness on the parts of the auditor and the audit, respectively. This seems to suggest that the caliber of auditors hired by registrars is quite high.
As with "industry knowledge," "professionalism" had a narrow range of customer satisfaction scores (0.36). This, too, seems to indicate that registrars are hiring well.
Thoroughness (Fig. 4)
This category covered such items as the objectivity and thoroughness of the audit. The fact that this category had a high mean score indicates that auditors are doing what they are supposed to do: going through the clients' processes with a fine-toothed comb.
Industry knowledge (Fig. 5)
The correlation of this category to customer satisfaction was low. However, the item regarding the application of ISO standards to a specific business stood out in terms of its rating with clients (4.08) as well as with its correlation to reselect/recommend (0.523).
Although industry knowledge seems unimportant in terms of customer satisfaction, these results do shoot down a complaint we frequently hear: that auditors don't understand how ISO 9000 applies to a particular industry.
Industry knowledge was one of the lowest-scoring categories (3.63) but also had the narrowest range of customer satisfaction scores (0.37). This may point to a consistency throughout the registration industry in hiring auditors who are more or less equally knowledgeable in their particular industry sector.
That said, the low scores should tell registrars that this area needs work.
Helpfulness (Fig. 6)
This is a contentious, politically sticky category: The helpfulness category contained several items regarding coaching or making suggestions. According to ISO/IEC Guide 62, although auditors or registrars can provide general information that could be helpful, they cannot make specific suggestions to clients--how to correct a nonconformance, for instance.
Because we don't know what clients may have meant when they responded to an item such as "The auditor gave suggestions on how to strengthen weak areas," we don't know whether specific auditors are violating Guide 62's prohibition on providing consulting or simply providing helpful information as allowed by the guide. Scores below the mean could indicate that some registrars err on the side of caution, being careful to present just the facts.
The wide range of registrar scores in this area raises an important question: Are certain registrars perceived as unhelpful because of their strict interpretation of Guide 62? If so, then are those with higher scores using a more lax interpretation?
Obviously, clarification of Guide 62 is needed, a fact that Joseph Dunbeck, CEO of the Registrar Accreditation Board, acknowledges. "Guide 62 is, unfortunately, not very explicit on what it means by consulting," says Dunbeck. "As accreditors, we are trying to get CASCO [ISO/IEC Committee on Conformity Assessment] to clarify Guide 62."
The president of one registrar told Quality Digest that he believes clients want and deserve direction for improving their processes and that this can be achieved without violating Guide 62. This registrar plans to "push Guide 62 to the limit."
Rather than worry about the specifics, it's more important to note that registrars ranked reasonably high in helpfulness and leave it at that. Without interviewing the clients, it's impossible to say more.
Despite the debate over Guide 62, helpfulness as a category had little impact on satisfaction.
Overall customer satisfaction (Fig. 7)
Figure 7 shows the overall response to the following two items:
"If I had to do it again, I would recommend this registrar to others."
"If I had to do it again, I would choose this registrar."
The registrars that ranked above the mean for overall customer satisfaction are Orion Registrar Inc. of Arvada, Colorado; TÜV Essen of San Jose, California; NSF-ISR of Ann Arbor, Michigan; SGS-ICS of Rutherford, New Jersey; and DNV Certification Inc. of Houston. (See Figure 7.)
It's interesting to note that these registrars span the entire range of sizes as judged by number of clients. Orion is a relatively small registrar with about 100 registered sites, whereas DNV has more than 3,200 registered sites.
Notice that the registrar farthest below the mean scored a 3.54 (where 3.0 equals neutral) on these two items. This means that even its clients hold a positive opinion on recommending or reselecting this registrar.
On the trail of continuous improvement
This survey indicates that the state of the registrar industry with respect to customer satisfaction is good overall, with several outstanding performers in each category.
The weakest category is helpfulness, although, as explained above, this category is problematic because it involves an interpretation of Guide 62.
As indicated by the industry knowledge and professionalism categories, hiring practices across registrars are consistent, with each registrar receiving close to the same amount of customer satisfaction for these two metrics.
The intent of this survey was not only to provide readers with some idea of how the industry is doing and who some of the key performers are, but also to give the industry as a whole a sense of how each registrar stacks up against the next. To encourage registrars to use this survey as we intended--that is, as a tool for continuous improvement--we provided each registrar with the complete survey results and a listing of its rank within the industry.
With one exception, all registrars we spoke to about the survey, including many who performed below the mean in one or more categories, were excited about the results and believed that the survey will be a useful tool. As one put it: "We do customer satisfaction surveys all the time, and the results are good. But whenever we show them to management, they always ask us, 'Yeah, but how is our competition doing?' Now we know."
This survey scared a few registrars. But in the end, registrars shouldn't be afraid to have the same level of scrutiny applied to them as is applied to their clients. Terms like "best practices" and "benchmarking" have their roots in the quality industry. This survey is intended to be a yardstick by which registrars can objectively assess their service, not one to be rapped across their knuckles. Used in that context, this information can be a good tool for improving registrar quality.
This data is a start. Let's use it well.