The SCANNER quality score is reported for each contractor on the basis of a set of rules described in this section. Because some of the scoring is applied per device and some per contractor, for reporting purposes the average score over all devices operated by a contractor is used in the calculation of the overall quality score.
To conduct SCANNER surveys, each survey device is required to pass yearly accreditation tests. All accredited devices are subject to stringent Quality Assurance (QA) checks throughout the survey year. These QA checks include monthly contractor primary site surveys, weekly contractor survey checks, and the Auditor's quality assurance of the network surveys. Further information on the quality assurance checks can be found in Volume 4 of the SCANNER specification.
There are four rules applied to determine the quality score for each device. Each survey device begins with a quality score of 100, which is recalculated quarterly using these rules. Each rule triggers demerits that are subtracted from the initial quality score.
1. The first rule is derived from the outcomes of the SCANNER accreditation tests, and hence a score is obtained for each survey device. The SCANNER accreditation tests each device against a set of defined requirements. However, as would be expected for any system as advanced and complex as a SCANNER survey device, the accreditation may identify areas where the device is deemed satisfactory but the test body considers that further enhancements to the device would be desirable. These recommendations are made in a document called an Improvement Action Plan (IAP), which is issued with the accreditation certificate and includes a timetable for the actions. When an IAP action is not responded to by the agreed date, 1 demerit will be awarded to that SCANNER survey vehicle.
2. Each device is required to carry out a quality assurance survey on its primary test site in each month in which SCANNER surveys have been carried out. If a monthly primary site QA survey is not carried out, 4 demerits will be awarded.
3. The results of the monthly primary site surveys are required to be delivered to the Auditor within 14 days of completion of the survey. If data from a completed monthly primary site is delivered late to the Auditor, 0.5 demerits will be awarded.
4. Each accredited device is required to conduct a weekly survey, and the Auditor requests a quarterly report of the weekly surveys successfully completed. If less than 90% of the weekly checks are completed, 0.5 demerits will be awarded; if less than 80% are completed, 1 demerit will be awarded.
The fifth QA scoring rule is based on the Auditor's QA of the network surveys and is assessed for each contractor. On delivery of completed network surveys to a Local Authority, the contractor is also required to deliver the data to the Auditor for assessment. Each quarter the Auditor requests a report of the completed surveys delivered to Local Authorities and compares this with the number of LA datasets that have actually been provided to the Auditor. Each dataset not delivered to the Auditor results in 0.5 demerits.
The average demerits over all the survey devices operated by the contractor are then calculated. This prevents a contractor running several devices from being excessively penalised by a simple summing of demerits, which would unfairly represent the quality of that contractor. The averaged device demerits are combined with the demerits from the Auditor's QA assessment (rule five) to obtain a total number of demerits, which is subtracted from the starting score of 100 to give the overall quality score.
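As a rough illustration of this aggregation (not the specification's own worked example), the calculation can be sketched as follows; the device names and demerit figures are hypothetical:

```python
# Illustrative sketch of the SCANNER quality score aggregation.
# Device names and demerit values are hypothetical examples only.

def quality_score(device_demerits, contractor_demerits):
    """Average the per-device demerits (rules 1-4), add the
    contractor-level demerits (rule 5), and subtract the total
    from the starting score of 100."""
    avg_device = sum(device_demerits.values()) / len(device_demerits)
    return 100 - (avg_device + contractor_demerits)

# Device A: one missed monthly primary site (rule 2, 4 demerits)
# plus one late primary site delivery (rule 3, 0.5 demerits).
# Device B: one unanswered IAP action (rule 1, 1 demerit).
devices = {"device_A": 4.5, "device_B": 1.0}
rule5 = 0.5  # one LA dataset not delivered to the Auditor

print(quality_score(devices, rule5))  # 100 - (2.75 + 0.5) = 96.75
```

Averaging before adding the rule 5 demerits means a contractor with many devices is scored on its typical device, not on the sum of every device's lapses.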
In summary the quality score rules are as follows, and a worked example is given below.
SCANNER Quality Score - Rules

| Rule | Check | Requirement | Demerits | Applied |
|------|-------|-------------|----------|---------|
| 1 | Accreditation | Formal response required for each accreditation IAP action by the agreed date (date agreed by formal acceptance of IAP by survey contractor) | 1 demerit for each action not responded to by agreed date | per vehicle |
| 2 | Monthly QA | 1 primary site survey expected per month in which the device is used for SCANNER surveys, each survey year | 4 demerits per missed site in any quarter | per vehicle |
| 3 | Monthly QA | Delivery of each completed primary site within 28 days¹ of test | 0.5 demerits per site delivered late | per vehicle |
| 4 | Weekly QA | Quarterly statement delivered to Auditor of weekly QA sites completed | 0.5 demerits for <90% done; 1 demerit for <80% done | per vehicle |
| 5 | Auditor's QA | Quarterly statement of data delivered to clients | 0.5 demerits for each LA for which data has been delivered to the client but not to the Auditor | per contractor |
• In certain circumstances (for example, a significant improvement in delivery/responses from a contractor) the Auditor may re-calculate the quality score within a quarter to reflect the improvements being seen.
• In exceptional circumstances (for example, mechanical failure of a device or staff illness) the Auditor may not award demerits if the circumstances resulted in a contractor not being able to implement the full QA process, despite their best efforts.
• Where a contractor is not carrying out surveys (e.g. holiday periods) the contractor is not required to deliver all of the QA data (monthly and weekly QA site tests). This is taken into account by the Auditor when calculating the quality scores.
¹ The SCANNER specification requires monthly primary sites to be delivered to the Auditor within 14 days of completing the test. For the initial introduction of the SCANNER quality score, a 28-day delivery window has been used to reflect the delivery timescales achieved by current contractors.