
Data analysis and methodology

Lead clinicians were asked to collect data on the basis of a unified service, typically within a trust, referred to as a ‘site’. For most trusts the ‘site’ was the trust itself; for some trusts there were several ‘sites’, each offering a discrete service, and a site may include several hospitals. Please note that ‘trust’ is used as a generic term; it is acknowledged that in Wales these organisations are Health Boards.

Data was collected at site level within trusts using a standardised method. Clinical involvement and supervision at team level were provided by a lead clinician in each hospital, who had overall responsibility for data quality. Data was entered using a web-based tool, and data quality was supported by built-in validations that prevented illogical data from being entered. Services entered data describing their service on a specific date (for the 2021 audit this was 1 October 2021) and were given five weeks to enter and check data, after which no changes were permitted.
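To illustrate the kind of built-in validation described above, the following is a minimal sketch only; the field names, thresholds, and rules are assumptions for illustration and are not the actual SSNAP checks.

```python
# A minimal sketch of record validation before acceptance.
# Field names and rules are hypothetical, not the actual SSNAP tool's checks.
from dataclasses import dataclass

@dataclass
class SiteRecord:
    stroke_unit_beds: int   # beds used by stroke patients at the site
    nurse_wte: float        # whole-time equivalent qualified nurses
    audit_date: str         # e.g. "2021-10-01"

def validate(record: SiteRecord) -> list[str]:
    """Return a list of validation errors; an empty list means the record can be saved."""
    errors = []
    if record.stroke_unit_beds <= 0:
        errors.append("Stroke unit beds must be a positive whole number.")
    if record.nurse_wte < 0:
        errors.append("Nurse WTE cannot be negative.")
    if record.nurse_wte > record.stroke_unit_beds * 5:
        errors.append("Nurse WTE looks implausibly high for the number of beds.")
    return errors

# Example: an illogical record is rejected with explanatory messages
print(validate(SiteRecord(stroke_unit_beds=0, nurse_wte=-2.0, audit_date="2021-10-01")))
```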

Participating sites were measured against specific criteria for 10 key indicators, with the same key indicators used in 2019 and 2021. These indicators were identified from the domains and key indicators of the 2016 audit, together with recent research and evidence. The standards against which acute stroke services were assessed were outlined throughout the written report; they include the 10 key indicator standards, the 2016 NICE Quality Standards and the National Clinical Guideline for Stroke (RCP, 2016).

Data was reported for every data item from the audit at national and site level. The national median for each measure was given to enable benchmarking. National results were presented as percentages, site variation was summarised by medians and inter-quartile ranges (IQR), and denominators were given within the national results column (see here for statistical terminology used in SSNAP reports). Ratios of staffing numbers per 10 stroke unit beds (per 30 stroke unit beds for psychology) were given rather than staffing numbers per stroke unit, to allow direct comparison with national standards and other sites. To calculate numbers per 10 beds, the whole-time equivalent (WTE) for each staffing discipline in a service was divided by the total number of beds used by stroke patients and then multiplied by 10; the same calculation, multiplied by 30 instead, was used for staff numbers per 30 beds.
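The staffing-ratio and benchmarking calculations described above can be sketched as follows; the site names and figures are invented purely for illustration, and the median and IQR are computed here with standard library functions rather than the actual SSNAP analysis code.

```python
# A minimal sketch of the WTE-per-10-beds ratio and the national median/IQR
# used for benchmarking. All site names and numbers are hypothetical.
import statistics

def wte_per_beds(wte: float, stroke_unit_beds: int, per: int = 10) -> float:
    """WTE for a discipline divided by the site's stroke unit beds, scaled per `per` beds."""
    return wte / stroke_unit_beds * per

# Hypothetical sites: (qualified nurse WTE, beds used by stroke patients)
sites = {
    "Site A": (22.5, 28),
    "Site B": (15.0, 20),
    "Site C": (31.0, 40),
}

# Ratio per 10 beds for each site (use per=30 for psychology staffing)
ratios = {name: wte_per_beds(wte, beds) for name, (wte, beds) in sites.items()}

national_median = statistics.median(ratios.values())
q1, _, q3 = statistics.quantiles(ratios.values(), n=4)

for name, ratio in ratios.items():
    print(f"{name}: {ratio:.1f} nurse WTE per 10 beds")
print(f"National median: {national_median:.1f} (IQR {q1:.1f} to {q3:.1f})")
```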

To represent the care available to patients at sites that do not treat patients in the first 72 hours after their stroke, these sites were assigned, for the relevant sections, the results of the site that provided this care. This applies to both the key indicator summary section and the full results section.
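The assignment described above amounts to copying across the relevant results from the linked providing site; the sketch below assumes a simple site-to-provider mapping and uses invented site names and values for illustration.

```python
# A minimal sketch: sites that do not treat patients in the first 72 hours
# take the relevant results of the site that provides that care.
# Mapping, section names, and values are hypothetical.

# Results for sections describing care in the first 72 hours
hyperacute_results = {
    "Site X (hyperacute)": {"key_indicator_1": 2.8, "key_indicator_2": "Yes"},
}

# Non-treating sites mapped to the site that provides their first-72-hour care
provider_for = {
    "Site Y (rehabilitation only)": "Site X (hyperacute)",
}

# Assign each non-treating site its provider's results for the relevant sections
assigned = {site: hyperacute_results[provider] for site, provider in provider_for.items()}

print(assigned["Site Y (rehabilitation only)"])
```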
