Abstract
Purpose: This study analyzed and quantified the sources of electronic health record (EHR) text documentation in ophthalmology progress notes. Design: EHR documentation review and analysis. Methods: SETTING: a single academic ophthalmology department. STUDY POPULATION: a cohort study conducted between November 1, 2016, and December 31, 2018, using secondary EHR data and a follow-up manual review of a random sample. The cohort study included 123,274 progress notes documented by 42 attending providers. These notes were for patients with the 5 most common primary International Statistical Classification of Diseases and Related Health Problems, version 10, parent codes for each provider. For the manual review, 120 notes from 8 providers were randomly sampled. Main outcome measurements were the number of characters or words in each note, categorized by attribution source, author type, and time of creation. Results: Imported text made up the majority of text in new and return patient notes: 2,978 characters (77%) and 3,612 characters (91%), respectively. Support staff members authored substantial portions of notes: 3,024 characters (68%) of new patient notes and 3,953 characters (83%) of return patient notes. Finally, providers completed large amounts of documentation after clinical visits: 135 words (35%) of new patient notes and 102 words (27%) of return patient notes. Conclusions: EHR documentation consists largely of imported text, is often authored by support staff, and is often written after the end of a visit. These findings raise questions about documentation accuracy and utility and may have implications for quality of care and patient-provider relationships.
Original language | English (US) |
---|---|
Pages (from-to) | 191-199 |
Number of pages | 9 |
Journal | American journal of ophthalmology |
Volume | 211 |
DOIs | https://doi.org/10.1016/j.ajo.2019.11.030 |
State | Published - Mar 2020 |
ASJC Scopus subject areas
- Ophthalmology
In: American journal of ophthalmology, Vol. 211, 03.2020, p. 191-199.
Research output: Contribution to journal › Article › peer-review
TY - JOUR
T1 - Electronic Health Records in Ophthalmology
T2 - Source and Method of Documentation
AU - Henriksen, Bradley S.
AU - Goldstein, Isaac H.
AU - Rule, Adam
AU - Huang, Abigail E.
AU - Dusek, Haley
AU - Igelman, Austin
AU - Chiang, Michael F.
AU - Hribar, Michelle R.
N1 - Funding Information: All authors have completed and submitted the ICMJE form for Disclosure of Potential Conflicts of Interest and none were reported. Funding/support: Supported by an unrestricted departmental grant from Research to Prevent Blindness (New York, New York), and by US National Institutes of Health (Bethesda, Maryland) grants R00LM12238, P30EY10572, and T15LM007088. Financial disclosures: MFC is a member of the Scientific Advisory Board for Clarity Medical Systems (Pleasanton, California); is a consultant for Novartis (Basel, Switzerland); and is a member of Inteleretina (Honolulu, Hawaii). All other authors have reported that they have no relationships relevant to the contents of this paper to disclose.
The present authors, building on prior work characterizing the time burden and redundancy of ophthalmic clinical documentation,20,21,31,34 present the following 3 key findings: 1) most EHR progress note text is copied or generated from pre-existing templates rather than manually entered; 2) staff enter a significant proportion of ophthalmology progress note text; and 3) a significant proportion of EHR text entered by providers is entered after the patient leaves.
The first key finding is that imported text comprises the majority of clinical note text for both new and return patient notes, with most of the imported text coming from templates. Imported text comprised 77% of new patient notes and 91% of return patient notes (Table 1), closely matching findings from EHR studies in internal medicine and orthopedics.28,35 Although the percentage of imported text varied by subspecialty, note text in all subspecialties consisted mainly of imported text. Import technologies improve documentation efficiency, structure, and consistency but may negatively impact note accuracy and patient safety.26,28,30,36 Specifically, templates and other import technologies may contribute to redundancy between notes (Figure 2). Import technologies may also lead to overly lengthy notes in which new and important clinical information forms a small percentage of the note and is difficult to identify. Recent studies have shown that providers struggle to manage the amount of information available in the EHR.37,38 In clinical practice, reviewing notes with excessive templated text may increase the time required for review.5,14 One survey found that 70% of physicians reported that reviewing notes took more time with EHRs than with paper.39 In addition, when notes consist mainly of imported text, it can be difficult to determine what actually happened during the clinical encounter versus what was simply populated by import technology (comprehensive review of medications, detailed examinations, problem lists, and so forth).8,40 This reliance on imported text is illustrated in Figure 2A, where most of the note, including the assessment and plan, is composed of imported text, making new information difficult to identify. Furthermore, imported text may propagate errors, particularly when large blocks of text are copied forward.36 Although the possible consequences of copy-and-pasted text are well documented,26 most imported text in the present study was templated text (Table 1). The source of the small amount of copied text in new patient notes was initially puzzling; an informal post hoc analysis of text copied into the 40 new patient notes included in the manual review revealed that most of this copied text appeared to come from other structured parts of the patient record (e.g., medication lists, laboratory results) rather than from other notes. The abundance of templated text may be driven by the use of EHR progress notes for assessment of quality and billing metrics, because templates make it easier to enter text that meets regulations.5,18,20,30,41
Although templated text may be more accurate than manual typing (e.g., importing a medication list), it may create other problems. For instance, one study found that providers used the medical record to create a "memory picture" of a returning patient.42 Use of templated text that produces similar notes for different patients (e.g., Figure 2B) may reduce the ability of the medical record to support this function, negatively affecting the relationship between providers and individual patients. However, the same study noted that structure can be beneficial to the medical record.42 This suggests that templated text is useful for adding structure to the note but not necessarily for generating large amounts of imported, similar text. The use of well-designed templates can create complete notes,43-45 but the authors believe excessive use of import technology can mask relevant data that might be reported more selectively if entered into notes manually. The use of non-progress-note data fields for billing and quality metrics is one potential approach to restoring the integrity and relevance of the clinical progress note.
The second key finding is that staff enter a significant proportion of ophthalmology progress note text. In this study, staff wrote on average 68% of new patient notes and 83% of return patient notes (Table 2). In the manual review, staff authored 50% of new patient notes and 65% of return patient notes. The large portions of notes authored by staff raise concerns about the ability to identify the author.46 In paper charts, authorship is often recognized by handwriting, but it is often not immediately apparent in EHRs. Providers commonly solve problems using "data-driven reasoning," accessing all available information, including authorship, to inform their decision-making process.47 The EHR attribution data used in this study identify authorship but are often not available in real time during patient visits, nor are they available for notes created in older versions of the EHR at this institution, for notes electronically transferred from other EHR systems, or for printed or faxed copies of notes. Simple methods for distinguishing authors, such as font color and type, may provide greater transparency of note authorship.
The third key finding is that providers continue to document notes after the clinic visit. Of the manually reviewed text authored by providers, large portions (35% new, 27% return) were added after the visit (Table 3). This finding is consistent with a previously published study reporting that providers spent 43% of EHR time outside of the clinical session and with a previous study from our group showing that providers completed 1.6 of 3.7 total EHR hours after clinic.5,20,25 Documentation after the patient encounter may result from the negative impact of EHRs on clinical efficiency, making it difficult to finish during the patient examination.5,14,48 Providers report that productivity and revenue have decreased after EHR implementation compared with paper notes.16 These efficiency and financial challenges, coupled with documentation work following the clinic visit,23,25 may negatively impact provider satisfaction and increase stress and burnout.23,38,49 Documentation following clinical visits may also have implications for patient safety due to inaccurate or incomplete recall, especially for examination findings or key discussion points. This study showed that providers used similar amounts of manual and imported text after the visit, suggesting that providers are still actively documenting after the visit rather than merely reviewing and electronically signing notes.
Reliance on templated text to fill in details after hours may produce notes that do not accurately represent what occurred during the clinical visit and may lead to errors and medicolegal liability.8 The relationship between after-hours documentation and note accuracy warrants further study. Interestingly, documentation habits varied widely among the providers in our study, with some completing nearly all documentation during the clinic visit and others completing most of their documentation after the visit (data not shown). For example, among the providers with manually reviewed study data, the provider who completed the highest percentage of documentation during the office visit also relied heavily on import technology (93% new patients, 95% return patients). Thus, while real-time completion of charts appears achievable, it may require heavier use of imported text. Real-time completion of charts may also be limited by provider knowledge of and efficiency with the EHR, length of patient examination and discussion, concerns about interference with the provider-patient relationship, and the level of documentation detail required by certain subspecialties.5,20 The authors hope these findings lend evidence to discussions of changes in EHR documentation policy. The Centers for Medicare and Medicaid Services (CMS) recently proposed payment changes that would lower documentation requirements, including focused history and examination components and review of support staff documentation without re-entering it.50 Although the financial consequences of such changes have yet to be determined, the adjusted documentation requirements may allow less reliance on lengthy templates, facilitate more transparency in authorship, and decrease documentation time.
This study had several limitations that future work could address. First, the study was retrospective and did not allow for causal conclusions. Second, both the large-scale analysis of ~125,000 progress notes and the manual review of 120 notes occurred at a single academic study site, which may not be representative of all ophthalmologists in all settings. That said, the manual sample yielded note composition and authorship results largely similar to those of the larger data set, suggesting the sample reflected typical provider notes in this setting. Additional research is warranted to examine the generalizability of these findings. This study also did not account for scribes, which warrants its own study. Third, this study was not designed to examine the utility or harm of different methods of entering note text. Future studies are needed to test specific hypotheses, such as whether importing text enhances note structure but increases the rate of including outdated information. Finally, this study excluded office visits with trainees, which may have different patterns of documentation with respect to text source and time.35 Overall, these findings raise concerns about patterns of ophthalmology EHR documentation. Most of the text in new and return patient notes is created using imported text, which may lack accuracy and mask relevant data. Support staff author large portions of notes, but attribution data are often not readily visible. Providers complete large portions of documentation following clinical visits, which raises questions about note accuracy, workflow efficiency, and provider quality of life. The authors believe there are important opportunities for collaboration among ophthalmologists, system developers, informatics experts, and policymakers to improve the utility of clinical documentation in EHRs.
Publisher Copyright: © 2019 Elsevier Inc.
PY - 2020/3
Y1 - 2020/3
N2 - Purpose: This study analyzed and quantified the sources of electronic health record (EHR) text documentation in ophthalmology progress notes. Design: EHR documentation review and analysis. Methods: SETTING: a single academic ophthalmology department. STUDY POPULATION: a cohort study conducted between November 1, 2016, and December 31, 2018, using secondary EHR data and a follow-up manual review of a random sample. The cohort study included 123,274 progress notes documented by 42 attending providers. These notes were for patients with the 5 most common primary International Statistical Classification of Diseases and Related Health Problems, version 10, parent codes for each provider. For the manual review, 120 notes from 8 providers were randomly sampled. Main outcome measurements were the number of characters or words in each note, categorized by attribution source, author type, and time of creation. Results: Imported text made up the majority of text in new and return patient notes: 2,978 characters (77%) and 3,612 characters (91%), respectively. Support staff members authored substantial portions of notes: 3,024 characters (68%) of new patient notes and 3,953 characters (83%) of return patient notes. Finally, providers completed large amounts of documentation after clinical visits: 135 words (35%) of new patient notes and 102 words (27%) of return patient notes. Conclusions: EHR documentation consists largely of imported text, is often authored by support staff, and is often written after the end of a visit. These findings raise questions about documentation accuracy and utility and may have implications for quality of care and patient-provider relationships.
UR - http://www.scopus.com/inward/record.url?scp=85077919457&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85077919457&partnerID=8YFLogxK
U2 - 10.1016/j.ajo.2019.11.030
DO - 10.1016/j.ajo.2019.11.030
M3 - Article
C2 - 31811860
AN - SCOPUS:85077919457
SN - 0002-9394
VL - 211
SP - 191
EP - 199
JO - American journal of ophthalmology
JF - American journal of ophthalmology
ER -