Research Article - (2023) Volume 8, Issue 4
Received: 16-Sep-2019, Manuscript No. JOMP-23-2500;
Editor assigned: 19-Sep-2019, Pre QC No. P-2500;
Reviewed: 03-Oct-2019, QC No. Q-2500;
Revised: 30-Jun-2023, Manuscript No. R-2500;
Published: 28-Jul-2023, DOI: 10.37421/2576-3857.2023.08.206, QI Number: JOMP-23-2500
Citation: Hurwitz, Gillian, Lesley Moody, Zahra Ismail and Monika Duddy, et al. "Improving Clinical Response to Cancer Symptom Screening: Leveraging Over Four Thousand Chart Audits for Actionable Person Centred Care." J Oncol Med and Pract 8 (2023): 206.
Copyright: © 2023 Hurwitz G, et al. This is an open-access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution and reproduction in any medium, provided the original author and source are credited.
Introduction: Routine symptom screening for cancer patients using Patient Reported Outcome Measures (PROMs) is standard practice in Ontario to identify physical and emotional symptoms that can go undetected by clinicians. However, provider response to PROMs is essential to addressing symptom burden. To measure clinician response, a Regional Cancer Centre (RCC) chart audit process was developed to determine whether clinical teams acknowledged, assessed and/or addressed commonly experienced oncology symptoms outlined in the Edmonton Symptom Assessment System (ESAS).
Methods: Annually, RCCs received a chart audit tool with preset options. Sites audited charts for seven of the ESAS symptoms using a business intelligence tool to access patient charts based on sampling parameters. RCCs were required to audit charts of patients whose ESAS symptom scores were moderate to severe (4-10), with at least five charts in the moderate range (4-6).
Results: Overall, 4,679 charts from all 14 RCCs were examined in the FY 2016/17 and FY2017/18 audits (2,377 charts and 2,302 charts, respectively). Depression (45.5%) and anxiety (49.0%) were the least likely to be recorded in the patient’s chart, whereas pain (75.1%) was the most likely to be noted. Patients reporting depression and anxiety were the least likely to be offered assessments (49.8% and 51.1%, respectively) and interventions (47.0%, 46.5%, respectively). Patients reporting pain were the most likely to receive assessments (72.2%) and interventions (64.3%).
Conclusion: Chart audits help measure clinical response to PROMs, provide useful information on gaps in care (including response to emotional symptoms) and can inform local quality improvement initiatives.
Keywords: Cancer • Oncology • Repositories • Proportion • Clinician
Oncology patients experience a high symptom burden and their psychological and physical symptoms are often underreported to or undetected by healthcare providers. At the point-of-care, symptom screening with Patient-Reported Outcome Measures (PROMs) can facilitate patient-provider communication, identify problems early and track symptom burden over time, with the ultimate goal of improving person-centred care. Having an established process for patients to report their symptoms to providers at point-of-care can help prevent symptom progression and adverse downstream consequences [1].
For instance, the integration of symptom screening into the routine clinical care of cancer patients has been associated with increased survival and decreased emergency department use compared with usual care. Additionally, web-based monitoring for lung cancer patients based on self-reported symptoms has improved overall survival due to proactive relapse detection and improved performance at relapse [2].
Since 2007, oncology patients in Ontario have been completing PROMs as part of their routine cancer care. Patients have been systematically reporting on commonly experienced cancer symptoms using the Edmonton Symptom Assessment System-Revised (ESAS-R) and the patient-reported Eastern Cooperative Oncology Group scale (p-ECOG). Of the 74 sites that provide cancer care in Ontario, 14 Regional Cancer Centres (RCCs) and over 50 partner site hospitals are currently collecting PROMs using a provincially developed electronic platform and are incorporating PROMs into routine clinical practice at point-of-care. In Ontario, over 40,000 symptom screens from cancer patients are collected each month, making Cancer Care Ontario’s database one of the largest patient-reported outcomes repositories in the world [3].
Historically, a screening rate has been used as a performance indicator of the uptake of routine symptom screening within the Ontario RCCs. The screening rate is defined as the proportion of patients who complete their PROMs out of all patients who had an eligible visit to the cancer centre in a given month. Using a symptom screening rate as a singular indicator has several limitations; one limitation is that a screening rate does not indicate whether PROMs completion led to appropriate clinical assessments and/or interventions. To address this gap, chart audits have been included as a mandatory deliverable for all RCCs, with the objective of measuring clinician response to symptom screening. RCCs are required to complete the chart audit every year using standardized tools and adhering to specifications outlined by Cancer Care Ontario. The objective of this study was to review multi-year chart audit results and determine whether moderate to severe symptom burden reported by patients on the ESAS-R prompts a clinical response [4].
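As a concrete illustration of the screening-rate indicator defined above, the sketch below computes the proportion of patients with an eligible visit in a given month who completed their PROMs. The record layout (patient_id, visit_month, completed_screen) and the screening_rate helper are hypothetical simplifications for illustration, not Cancer Care Ontario's actual reporting model.

```python
from dataclasses import dataclass

@dataclass
class Visit:
    patient_id: str
    visit_month: str        # e.g. "2017-01"
    completed_screen: bool  # did the patient complete their PROMs at this visit?

def screening_rate(visits, month):
    """Proportion of patients with an eligible visit in `month`
    who completed their PROMs at least once that month."""
    eligible = {v.patient_id for v in visits if v.visit_month == month}
    screened = {v.patient_id for v in visits
                if v.visit_month == month and v.completed_screen}
    return len(screened) / len(eligible) if eligible else 0.0

# Example: two of three eligible patients screened in the month -> 0.67
visits = [Visit("A", "2017-01", True),
          Visit("B", "2017-01", False),
          Visit("C", "2017-01", True)]
print(round(screening_rate(visits, "2017-01"), 2))  # 0.67
```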
Audit tool creation
To create the audit tool that would be used by the RCCs for the chart review, CCO generated a list of possible symptom management data elements that would best capture practices in the provision of quality symptom management. The expertise of local clinical champions, along with a critical review of existing guidelines and algorithms on best practices in symptom management, was used to determine the broad headings and data fields of the audit tool. The audit tool was separated into categories reflecting how clinician(s) respond to the patient’s symptom screen on any given visit: Acknowledgement (i.e., whether the symptom was recorded in the provider’s documentation), assessment (i.e., was the patient offered a symptom assessment?) and intervention (i.e., was the patient offered an intervention or management plan?). Auditors from each RCC populated the audit tool (a Microsoft Excel spreadsheet) using preset drop-down menu options. The specific elements of the audit and the corresponding list of possible responses are included in Appendix A. Given that chart audits are an annual quality improvement deliverable in the Integrated Cancer Plan (ICP) in Ontario, ethics approval was not required for the creation or use of the tool [5].
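To make the structure of the audit tool concrete, the sketch below models one audited chart under the acknowledgement/assessment/intervention framework with preset response options. The AuditRecord fields and the option strings shown are illustrative assumptions; the actual drop-down lists are those defined in the appendix of the audit tool.

```python
from dataclasses import dataclass

# Hypothetical preset options; the real drop-down lists are defined in the
# audit tool's appendix and are not reproduced here.
ACKNOWLEDGEMENT_OPTIONS = {"Yes", "No", "Other/missing"}
ASSESSMENT_OPTIONS = {"Assessment offered", "No assessment", "Other/missing"}
INTERVENTION_OPTIONS = {"Intervention offered", "Intervention declined",
                        "No intervention", "Other/missing"}

@dataclass
class AuditRecord:
    symptom: str          # one of the seven audited ESAS-R symptoms, e.g. "pain"
    esas_score: int       # patient-reported score, 0-10
    acknowledgement: str  # was the symptom recorded in the provider's documentation?
    assessment: str       # was the patient offered a symptom assessment?
    intervention: str     # was an intervention or management plan offered?

    def validate(self):
        """Reject entries that fall outside the preset drop-down options."""
        if not 0 <= self.esas_score <= 10:
            raise ValueError("ESAS score must be 0-10")
        if self.acknowledgement not in ACKNOWLEDGEMENT_OPTIONS:
            raise ValueError("unknown acknowledgement option")
        if self.assessment not in ASSESSMENT_OPTIONS:
            raise ValueError("unknown assessment option")
        if self.intervention not in INTERVENTION_OPTIONS:
            raise ValueError("unknown intervention option")
```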
The Edmonton Symptom Assessment System (ESAS-R) domains
The main purpose of the chart audit was to assess clinicians’ response to patient self-reported symptoms on the PROM, ESAS-R. ESAS-R is a valid and reliable tool used to assist in the assessment of nine commonly experienced symptoms in cancer patients. On a scale of zero to ten, the patient selects the most appropriate number between two extremes (i.e., ‘no pain’ to ‘worst possible pain’). For the purpose of the chart audit, seven of the nine symptoms were audited (pain, nausea, depression, anxiety, dyspnea, fatigue and lack of appetite).
Wellbeing and drowsiness were not part of the audit. Each RCC audited at least 20 charts for each of the seven ESAS-R symptoms included in the audit. The sample required that all of the audited symptom scores be four or above, with at least five of the audited charts per symptom with scores in the moderate range (4-6) [6].
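As a minimal sketch of the sampling rule described above, the hypothetical helper below checks that a symptom's sample has at least 20 charts, that every audited score is four or above, and that at least five charts fall in the moderate range (4-6); it is illustrative only, not part of the provincial tooling.

```python
def sample_meets_criteria(scores):
    """Check one symptom's sample: >= 20 charts, every score 4 or above,
    and at least five charts in the moderate range (4-6)."""
    moderate = [s for s in scores if 4 <= s <= 6]
    return (len(scores) >= 20
            and all(s >= 4 for s in scores)
            and len(moderate) >= 5)

# Example: 20 charts, five moderate (4-6) and fifteen severe (7-10) -> True
print(sample_meets_criteria([4, 5, 5, 6, 6] + [8] * 15))  # True
```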
Study setting
The chart audit was conducted in a cross-sectional manner and was completed by each of the 14 RCCs in Ontario. A package, including the audit tool and data dictionary, was given to sites in advance of the audit period. This study examines two years’ worth of chart audit data (audit periods of January to May 2017 and August to November 2018, respectively).
After the auditing period, sites were given one month for data compilation and recording, before submitting the completed audit tool to Cancer Care Ontario via a secure file upload portal [7].
Abstraction procedure
An electronic symptom management business intelligence platform was used to generate patient health card numbers that met the aforementioned sampling requirements within the audit timeframe. For the 2016/2017 audit, RCC administrators accessed patient records meeting the sampling criteria by using the platform to filter patients based on symptom score, disease type and date of symptom screen.
For the FY 2017/2018 audit, Cancer Care Ontario sent RCCs a pre-selected list of patients to audit, based on the sampling criteria. Charts were not selected for abstraction based on other demographics, such as gender, age and cancer type [8].
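The chart-selection step can be sketched as a simple filter on symptom score, disease type and date of symptom screen, mirroring what an RCC administrator would do through the business intelligence platform. The ScreenRecord layout, the select_charts helper and the random shuffle are assumptions for illustration, not the platform's actual interface.

```python
import random
from dataclasses import dataclass
from datetime import date

@dataclass
class ScreenRecord:
    health_card_number: str
    symptom: str        # e.g. "pain"
    score: int          # ESAS-R score, 0-10
    disease_type: str
    screen_date: date

def select_charts(records, symptom, start, end, disease_type=None, n=20, seed=1):
    """Return up to n charts for one symptom with moderate-to-severe scores
    (4-10) screened within the audit window, optionally limited to one
    disease type."""
    eligible = [r for r in records
                if r.symptom == symptom
                and 4 <= r.score <= 10
                and start <= r.screen_date <= end
                and (disease_type is None or r.disease_type == disease_type)]
    random.Random(seed).shuffle(eligible)
    return eligible[:n]
```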
Analytics and dissemination of results
Upon receiving completed chart audit tools from the regions, patient data were de-identified and all data were aggregated. Provincial frequency summaries were provided for each of the seven symptoms and a cumulative summary for all symptoms (i.e., how often were interventions offered to patients with a certain score and/or symptom?). A provincial data report was shared with key regional stakeholders, including local champions, hospital leadership and clinicians.
Regional leads were encouraged to further analyze their RCC-specific aggregate and patient-level data and use the results to drive quality improvement initiatives within their RCCs, with the ultimate aim of improving person-centred care [9].
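The provincial roll-up described above amounts to aggregating de-identified audit rows into per-symptom frequency summaries. The sketch below assumes a simplified row layout with boolean response fields (acknowledged, assessed, intervened); it is illustrative, not the analytics pipeline actually used.

```python
from collections import defaultdict

def frequency_summary(rows):
    """rows: one dict per audited chart (identifiers already removed) with
    keys 'symptom', 'acknowledged', 'assessed', 'intervened' (booleans).
    Returns per-symptom frequencies of each clinical response."""
    counts = defaultdict(lambda: {"charts": 0, "acknowledged": 0,
                                  "assessed": 0, "intervened": 0})
    for row in rows:
        c = counts[row["symptom"]]
        c["charts"] += 1
        for key in ("acknowledged", "assessed", "intervened"):
            c[key] += bool(row[key])

    summary = {}
    for symptom, c in counts.items():
        n = c["charts"]
        summary[symptom] = {"charts": n,
                            "acknowledged_pct": 100 * c["acknowledged"] / n,
                            "assessed_pct": 100 * c["assessed"] / n,
                            "intervened_pct": 100 * c["intervened"] / n}
    return summary
```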
Descriptive statistics
Overall, 4,679 charts from all 14 RCCs were examined in the FY 2016/2017 and FY2017/2018 audits (2,377 charts and 2,302 charts, respectively).
Acknowledgement of symptoms
In total, ESAS symptoms were more likely to be mentioned in the provider’s documentation as symptom severity increased (N=2798). Moderate symptom scores (4-6) were less likely to be mentioned in the provider’s documentation (53.1%) compared with severe symptom scores (7-10) (64.4%). “Others/missing” data were also recorded, meaning that data were not available or were incorrectly inputted by the auditor [10].
Overall, 59.8% of all audited patients had their symptoms acknowledged by at least one of their healthcare providers at the time of completing their symptom screen (N=2797). Variability existed regarding the frequency of symptoms being mentioned in the provider’s documentation. For instance, emotional symptoms, depression (45.5%) and anxiety (49.0%), were the least likely to be recorded in the patient’s chart. Pain was the most likely to be noted in the provider’s documentation (75.1%) (Figure 1) [11].
Assessment of symptoms
In total, 59.7% of patients received some form of assessment in response to their ESAS screen (N=2797) and 34.9% of patients did not receive an assessment (N=1633). Patients reporting depression (49.8%) and anxiety (51.1%) were less likely to be offered assessments compared to patients reporting pain (72.3%) (Figure 2) [12].
Additionally, assessments were offered to patients expressing a severe symptom burden (62.5%) more often than to those expressing a moderate symptom burden (56.1%).
Intervention for symptom management
Overall, 52.9% of audited patients were offered some form of an intervention (N=2475) in response to their ESAS screen and of those patients, 1.6% declined the intervention offered (N=74). The frequency of interventions provided varied based on symptom type. Some patients expressing emotional concerns received interventions for depression and anxiety, such as referral to psychosocial providers and patient education (47.0% and 46.5%, respectively). Worth noting is that a small percentage of patients expressing depression and anxiety opted to decline interventions offered by clinicians (3.1% and 2.8%, respectively). Patients reporting pain were the most likely to receive an intervention (64.4%) (Figure 3).
The frequency of interventions offered differed based on symptom severity. Patients were less likely to receive an intervention if they reported an ESAS score in the moderate range (4-6) on any given symptom (50.8%). Patients were the most likely to receive an intervention if they reported a severe ESAS score (7-10) (54.4%). Examples of interventions included referral to other healthcare provider(s), referral to PSO support (psychologist, social worker, dietician, etc.), prescription for medication and patient education.
In 2016, Cancer Care Ontario introduced a revised chart audit tool to examine clinician response to symptom screening in Ontario, in accordance with the framework of symptom acknowledgement, assessment and intervention. Based on two years’ worth of provincial aggregate data, several patterns are clear. Not surprisingly, patients’ symptoms were more likely to be addressed if the symptom score was severe or if the symptom was pain. Physical symptoms, like pain, may be easier to operationalize and perhaps a clear management plan and referral pathway are already in place for physical symptoms. Patients expressing depression and anxiety were the least likely to receive an assessment or intervention. This result is consistent with findings from the literature: routine screening for complex issues such as depression “requires extensive strategies, such as decision-making aids, education and clear management plans and clinical pathways, in order to effectively improve patient outcomes.”
The gap in care for emotional symptoms highlights the need for more established processes and enhanced provider competency in responding to patients reporting symptoms of depression and anxiety. Interestingly, patients reporting depression and anxiety were the most likely to refuse interventions, possibly due to persistent stigma around emotional symptoms or the patient’s lack of desire to attend an additional appointment. Given the complex nature of emotional symptoms, Cancer Care Ontario is evaluating a multidimensional symptom-specific PROM, specifically the Patient Health Questionnaire-9 (PHQ-9) for depression. Additional information from more detailed PROMs on emotional concerns may allow the clinical team to target assessment and determine if referral is necessary [13].
Local clinical champions anecdotally noted that the new audit tool made the chart audit process easier and provided RCCs with granular, actionable items, which, in turn, were used to drive quality improvement. For instance, some RCCs integrated their specific chart audit data and the provincial comparator into clinician education practices and performance measurement, as a mechanism to improve symptom screening practices. Further analysis of centre-specific chart audit data can elucidate areas of improvement based on clinic and disease type and provide an impetus for quality improvement initiatives [14].
Overall, the use of chart audits to understand response to symptom screening provides useful information on the gaps in care and can inform local quality improvement initiatives. Since chart audits are labor intensive, reducing the frequency of chart audits to every two years may be reasonable. Furthermore, if symptom-specific PROMs are implemented for emotional concerns, chart audits may be a mechanism to understand whether interventions increase in response to these more detailed measures. Conducting chart audits is a laborious process, but it provides rich insight into the quality of cancer symptom management across the province. In conjunction with the screening rate, Cancer Care Ontario can continue to use chart audit data to improve symptom screening and management processes, with the ultimate goal of enhancing person-centred care.
The audit tool was slightly revised between FY 2016/2017 and FY 2017/2018; therefore, some of the fields are not directly comparable. Furthermore, the abstraction process for pulling charts differed from FY 2016/2017 to FY 2017/2018: hospital administrators could self-select patient charts based on the selection criteria in FY 2016/2017, whereas CCO sent a pre-selected random sample of patient charts meeting the criteria to audit in FY 2017/2018.
A limitation inherent to the chart audit process is that results do not comment on the appropriateness of the intervention, but only whether an intervention occurred. Furthermore, these findings are based solely on what was documented as happening at the clinic visit and the data are not necessarily reflective of what actually happened during the clinic visit. Another drawback of this study is that the FY 2016/2017 tool included fields like “the patient’s most important symptom” and “symptom addressed on previous visit”, which were not clearly defined, resulting in diverse interpretations among auditors. Slight revisions to the FY 2017/2018 tool addressed this discordance.
The authors would like to acknowledge Mindaugas Mozuraitis and Faith Lee for their support with the analytics of this project. Further, the authors would like to thank the Ontario Cancer Symptom Management Collaborative (OCSMC) leads for their efforts in collecting chart audit data.
The authors have no competing interests to declare.