Discharge Summary Hospital Course Summarisation of In Patient Electronic
Health Record Text with Clinical Concept Guided Deep Pre-Trained Transformer
Models
- URL: http://arxiv.org/abs/2211.07126v3
- Date: Mon, 10 Apr 2023 16:06:41 GMT
- Title: Discharge Summary Hospital Course Summarisation of In Patient Electronic
Health Record Text with Clinical Concept Guided Deep Pre-Trained Transformer
Models
- Authors: Thomas Searle, Zina Ibrahim, James Teo, Richard Dobson
- Abstract summary: Brief Hospital Course (BHC) summaries are succinct summaries of an entire hospital encounter, embedded within discharge summaries.
We evaluate a range of methods for BHC summarisation, demonstrating the performance of deep learning summarisation models.
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Brief Hospital Course (BHC) summaries are succinct summaries of an entire
hospital encounter, embedded within discharge summaries, written by senior
clinicians responsible for the overall care of a patient. Methods to
automatically produce summaries from inpatient documentation would be
invaluable in reducing clinician manual burden of summarising documents under
high time-pressure to admit and discharge patients. Automatically producing
these summaries from the inpatient course is a complex, multi-document
summarisation task, as source notes are written from various perspectives
(e.g. nursing, doctor, radiology) during the course of the hospitalisation. We
demonstrate a range of methods for BHC summarisation, evaluating the
performance of deep learning summarisation models across extractive and
abstractive summarisation scenarios. We also test a novel ensemble extractive
and abstractive summarisation model that incorporates a medical concept
ontology (SNOMED) as a clinical guidance signal and shows superior performance
on two real-world clinical data sets.
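The concept-guided extractive idea can be illustrated with a minimal sketch. This is not the authors' model: the hand-picked concept set and weighting below merely stand in for the SNOMED ontology signal that guides sentence selection.

```python
# Illustrative sketch (not the paper's model): a frequency-based extractive
# scorer that up-weights sentences mentioning clinical concepts, mimicking
# the idea of an ontology-derived guidance signal.
from collections import Counter

def extractive_summary(sentences, concepts, top_k=2, concept_weight=2.0):
    """Score each sentence by average word frequency, boosting concept hits."""
    words = [w.lower().strip(".,") for s in sentences for w in s.split()]
    freq = Counter(words)
    concepts = {c.lower() for c in concepts}

    def score(sentence):
        tokens = [w.lower().strip(".,") for w in sentence.split()]
        base = sum(freq[t] for t in tokens) / max(len(tokens), 1)
        bonus = concept_weight * sum(t in concepts for t in tokens)
        return base + bonus

    ranked = sorted(sentences, key=score, reverse=True)[:top_k]
    return [s for s in sentences if s in ranked]  # keep source order

notes = [
    "Patient admitted with community acquired pneumonia.",
    "Family visited in the afternoon.",
    "Started ceftriaxone and monitored oxygen saturation.",
]
summary = extractive_summary(notes, concepts={"pneumonia", "ceftriaxone"})
```

In this toy example the concept bonus pulls the two clinically relevant sentences ahead of the social-context one, which is the intuition behind clinical concept guidance.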
Related papers
- Query-Guided Self-Supervised Summarization of Nursing Notes
We introduce QGSumm, a query-guided self-supervised domain adaptation framework for nursing note summarization.
Our approach generates high-quality, patient-centered summaries without relying on reference summaries for training.
arXiv Detail & Related papers (2024-07-04T18:54:30Z)
- NOTE: Notable generation Of patient Text summaries through Efficient approach based on direct preference optimization
"NOTE" stands for "Notable generation Of patient Text summaries through an Efficient approach based on direct preference optimization"
Patient events are sequentially combined and used to generate a discharge summary for each hospitalization.
NOTE can be used to generate not only discharge summaries but also various other summaries throughout a patient's journey.
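The direct preference optimization (DPO) objective that NOTE builds on can be sketched as a small loss over per-summary log-probabilities. This is a hedged illustration: the beta value and the input log-probabilities are assumptions, not values from the paper.

```python
# Minimal sketch of the DPO loss: given log-probabilities of a preferred
# ("chosen") and dispreferred ("rejected") summary under the policy and a
# frozen reference model, push the policy toward the preferred summary.
import math

def dpo_loss(policy_chosen, policy_rejected, ref_chosen, ref_rejected, beta=0.1):
    """-log(sigmoid(beta * (chosen log-ratio - rejected log-ratio)))."""
    margin = beta * ((policy_chosen - ref_chosen)
                     - (policy_rejected - ref_rejected))
    return -math.log(1.0 / (1.0 + math.exp(-margin)))
```

When the policy matches the reference the margin is zero and the loss is log 2; the loss falls as the policy assigns relatively more probability to the preferred summary.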
arXiv Detail & Related papers (2024-02-19T06:43:25Z) - Generating medically-accurate summaries of patient-provider dialogue: A
multi-stage approach using large language models [6.252236971703546]
An effective summary is required to be coherent and accurately capture all the medically relevant information in the dialogue.
This paper tackles the problem of medical conversation summarization by discretizing the task into several smaller dialogue-understanding tasks.
arXiv Detail & Related papers (2023-05-10T08:48:53Z) - Development and validation of a natural language processing algorithm to
pseudonymize documents in the context of a clinical data warehouse [53.797797404164946]
The study highlights the difficulties faced in sharing tools and resources in this domain.
We annotated a corpus of clinical documents according to 12 types of identifying entities.
We build a hybrid system, merging the results of a deep learning model as well as manual rules.
arXiv Detail & Related papers (2023-03-23T17:17:46Z) - SPeC: A Soft Prompt-Based Calibration on Performance Variability of
Large Language Model in Clinical Notes Summarization [50.01382938451978]
We introduce a model-agnostic pipeline that employs soft prompts to diminish variance while preserving the advantages of prompt-based summarization.
Experimental findings indicate that our method not only bolsters performance but also effectively curbs variance for various language models.
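The soft-prompt mechanism can be sketched as prepending a small trainable matrix of "virtual token" embeddings to the input embeddings, leaving the language model itself frozen. The dimensions below are assumptions for illustration, not SPeC's actual configuration.

```python
# Sketch of soft prompting (hypothetical shapes, not the SPeC code):
# a matrix of virtual-token embeddings is prepended to each input sequence.
# In a real setup this matrix would be the only trained parameter; here it
# is simply randomly initialised to show the shapes involved.
import numpy as np

rng = np.random.default_rng(0)
d_model, prompt_len = 16, 4                                  # assumed sizes
soft_prompt = rng.normal(0, 0.02, size=(prompt_len, d_model))

def with_soft_prompt(token_embeddings):
    """Prepend the virtual tokens to a (seq_len, d_model) embedding matrix."""
    return np.concatenate([soft_prompt, token_embeddings], axis=0)

seq = rng.normal(size=(10, d_model))      # embeddings of a clinical note
extended = with_soft_prompt(seq)          # shape (prompt_len + 10, d_model)
```

Because the same learned prompt is shared across inputs, it can act as a calibration signal that reduces output variance across prompt phrasings, which is the effect the paper reports.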
arXiv Detail & Related papers (2023-03-23T04:47:46Z)
- Retrieval-Augmented and Knowledge-Grounded Language Models for Faithful Clinical Medicine
We propose the Re³Writer method with retrieval-augmented generation and knowledge-grounded reasoning.
We demonstrate the effectiveness of our method in generating patient discharge instructions.
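The retrieval step of such a retrieval-augmented approach can be sketched with a toy bag-of-words retriever. This is only an illustration with hypothetical data; Re³Writer's actual retriever operates over learned representations of prior patient records.

```python
# Toy sketch of retrieval augmentation: find the most similar prior
# discharge instruction (cosine similarity over word counts) so it can be
# supplied to the generator as grounding context.
from collections import Counter
import math

def cosine(a, b):
    num = sum(a[t] * b[t] for t in a)
    den = (math.sqrt(sum(v * v for v in a.values()))
           * math.sqrt(sum(v * v for v in b.values())))
    return num / den if den else 0.0

def retrieve(query, corpus):
    """Return the corpus document most similar to the query."""
    q = Counter(query.lower().split())
    return max(corpus, key=lambda d: cosine(q, Counter(d.lower().split())))

past_instructions = [
    "take antibiotics twice daily and return if fever persists",
    "elevate the leg and apply ice for swelling",
]
best = retrieve("patient discharged with fever and antibiotics",
                past_instructions)
```

The retrieved instruction would then be concatenated into the generator's input, grounding the new discharge instructions in how similar cases were previously handled.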
arXiv Detail & Related papers (2022-10-23T16:34:39Z)
- Self-supervised Answer Retrieval on Clinical Notes
We introduce CAPR, a rule-based self-supervision objective for training Transformer language models for domain-specific passage matching.
We apply our objective in four Transformer-based architectures: Contextual Document Vectors, Bi-, Poly- and Cross-encoders.
We report that CAPR outperforms strong baselines in the retrieval of domain-specific passages and effectively generalizes across rule-based and human-labeled passages.
arXiv Detail & Related papers (2021-08-02T10:42:52Z)
- What's in a Summary? Laying the Groundwork for Advances in Hospital-Course Summarization
Given the documentation authored throughout a patient's hospitalization, generate a paragraph that tells the story of the patient admission.
We construct an English, text-to-text dataset of 109,000 hospitalizations (2M source notes) and their corresponding summary proxy: the clinician-authored "Brief Hospital Course".
Our analysis identifies multiple implications for modeling this complex, multi-document summarization task.
arXiv Detail & Related papers (2021-04-12T19:31:48Z)
- Clinical Outcome Prediction from Admission Notes using Self-Supervised Knowledge Integration
Outcome prediction from clinical text can prevent doctors from overlooking possible risks.
Diagnoses at discharge, procedures performed, in-hospital mortality and length-of-stay prediction are four common outcome prediction targets.
We propose clinical outcome pre-training to integrate knowledge about patient outcomes from multiple public sources.
arXiv Detail & Related papers (2021-02-08T10:26:44Z)
- Generating SOAP Notes from Doctor-Patient Conversations Using Modular Summarization Techniques
We introduce the first complete pipelines to leverage deep summarization models to generate SOAP notes.
We propose Cluster2Sent, an algorithm that extracts important utterances relevant to each summary section.
Our results speak to the benefits of structuring summaries into sections and annotating supporting evidence when constructing summarization corpora.
arXiv Detail & Related papers (2020-05-04T19:10:26Z)
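The extract-then-abstract structure behind Cluster2Sent can be sketched as follows. The keyword map is a hypothetical stand-in for the paper's learned clustering: utterances are first grouped by SOAP-style section, and each group then supports one section-level summary sentence.

```python
# Hedged sketch of the Cluster2Sent idea: assign dialogue utterances to
# SOAP-style sections (here by keyword overlap, standing in for a learned
# clustering), so each section's summary can cite its supporting evidence.
SECTION_KEYWORDS = {                       # hypothetical keyword map
    "subjective": {"feel", "pain", "tired"},
    "assessment": {"diagnosis", "likely", "consistent"},
    "plan": {"prescribe", "follow", "schedule"},
}

def cluster_utterances(utterances):
    """Group utterances into sections by keyword overlap."""
    sections = {name: [] for name in SECTION_KEYWORDS}
    for utt in utterances:
        tokens = set(utt.lower().split())
        for name, keywords in SECTION_KEYWORDS.items():
            if tokens & keywords:
                sections[name].append(utt)
    return sections

dialogue = [
    "I feel a sharp pain in my chest.",
    "This is consistent with costochondritis.",
    "I will prescribe ibuprofen and schedule a follow-up.",
]
sections = cluster_utterances(dialogue)
```

Structuring the corpus this way, with per-section evidence, is what the paper's results attribute the quality gains to.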
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.