Entity-enhanced BERT for medical specialty prediction based on clinical questionnaire data.
A medical specialty prediction system for remote diagnosis can reduce the unexpected costs incurred by first-visit patients who visit the wrong hospital department for their symptoms. To develop medical specialty prediction systems, several researchers have explored clinical predictive models using...
Saved in:
Main Authors: Soyeon Lee, Ye Ji Han, Hyun Joon Park, Byung Hoon Lee, DaHee Son, SoYeon Kim, HyeonJong Yang, TaeJun Han, EunSun Kim, Sung Won Han
Format: Article
Language: English
Published: Public Library of Science (PLoS), 2025-01-01
Series: PLoS ONE
Online Access: https://doi.org/10.1371/journal.pone.0317795
_version_ | 1825206805361328128 |
---|---|
author | Soyeon Lee; Ye Ji Han; Hyun Joon Park; Byung Hoon Lee; DaHee Son; SoYeon Kim; HyeonJong Yang; TaeJun Han; EunSun Kim; Sung Won Han |
author_facet | Soyeon Lee; Ye Ji Han; Hyun Joon Park; Byung Hoon Lee; DaHee Son; SoYeon Kim; HyeonJong Yang; TaeJun Han; EunSun Kim; Sung Won Han |
author_sort | Soyeon Lee |
collection | DOAJ |
description | A medical specialty prediction system for remote diagnosis can reduce the unexpected costs incurred by first-visit patients who visit the wrong hospital department for their symptoms. To develop medical specialty prediction systems, several researchers have explored clinical predictive models using real medical text data. Medical text data contain large amounts of information about patients, which increases the sequence length. Hence, a few studies have attempted to extract entities from the text as concise features that provide domain-specific knowledge for clinical text classification. However, existing approaches remain insufficient at injecting this knowledge into the model effectively. Thus, we propose Entity-enhanced BERT (E-BERT), which utilizes the structural attributes of BERT for medical specialty prediction. E-BERT has an entity embedding layer and entity-aware attention to inject domain-specific knowledge and focus on relationships between medical entities within the sequences. Experimental results on clinical questionnaire data demonstrate the superiority of E-BERT over the other benchmark models, regardless of the input sequence length. Moreover, visualization of the effects of entity-aware attention shows that E-BERT effectively incorporates domain-specific knowledge alongside other information, enabling it to capture contextual information in the text. Finally, the robustness and applicability of the proposed method are explored by applying it to other pre-trained language models. Such an effective medical specialty prediction model can provide practical information to first-visit patients, streamlining the diagnostic process and improving the quality of medical consultations. (An illustrative sketch of the entity embedding and entity-aware attention components follows the record fields below.) |
format | Article |
id | doaj-art-96b9ae3c89ac4e589d49bf96488710ee |
institution | Kabale University |
issn | 1932-6203 |
language | English |
publishDate | 2025-01-01 |
publisher | Public Library of Science (PLoS) |
record_format | Article |
series | PLoS ONE |
spellingShingle | Soyeon Lee; Ye Ji Han; Hyun Joon Park; Byung Hoon Lee; DaHee Son; SoYeon Kim; HyeonJong Yang; TaeJun Han; EunSun Kim; Sung Won Han; Entity-enhanced BERT for medical specialty prediction based on clinical questionnaire data.; PLoS ONE |
title | Entity-enhanced BERT for medical specialty prediction based on clinical questionnaire data. |
title_full | Entity-enhanced BERT for medical specialty prediction based on clinical questionnaire data. |
title_fullStr | Entity-enhanced BERT for medical specialty prediction based on clinical questionnaire data. |
title_full_unstemmed | Entity-enhanced BERT for medical specialty prediction based on clinical questionnaire data. |
title_short | Entity-enhanced BERT for medical specialty prediction based on clinical questionnaire data. |
title_sort | entity enhanced bert for medical specialty prediction based on clinical questionnaire data |
url | https://doi.org/10.1371/journal.pone.0317795 |
work_keys_str_mv | AT soyeonlee entityenhancedbertformedicalspecialtypredictionbasedonclinicalquestionnairedata AT yejihan entityenhancedbertformedicalspecialtypredictionbasedonclinicalquestionnairedata AT hyunjoonpark entityenhancedbertformedicalspecialtypredictionbasedonclinicalquestionnairedata AT byunghoonlee entityenhancedbertformedicalspecialtypredictionbasedonclinicalquestionnairedata AT daheeson entityenhancedbertformedicalspecialtypredictionbasedonclinicalquestionnairedata AT soyeonkim entityenhancedbertformedicalspecialtypredictionbasedonclinicalquestionnairedata AT hyeonjongyang entityenhancedbertformedicalspecialtypredictionbasedonclinicalquestionnairedata AT taejunhan entityenhancedbertformedicalspecialtypredictionbasedonclinicalquestionnairedata AT eunsunkim entityenhancedbertformedicalspecialtypredictionbasedonclinicalquestionnairedata AT sungwonhan entityenhancedbertformedicalspecialtypredictionbasedonclinicalquestionnairedata |
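The description field above names two architectural additions to BERT: an entity embedding layer that injects domain-specific knowledge into the input representation, and entity-aware attention that focuses the model on medical entities within the questionnaire text. The sketch below is a minimal, hypothetical illustration of how those two ideas could be wired around a Hugging Face `BertModel`; it is not the authors' released E-BERT code, and the class name, `num_entity_types`, `entity_type_ids`, and the particular attention formulation are assumptions made for illustration only.

```python
# Illustrative sketch only (not the paper's implementation): an entity embedding layer
# added to BERT's input embeddings, plus a simple entity-aware attention step in which
# the [CLS] representation attends to tokens tagged as medical entities.
import torch
import torch.nn as nn
from transformers import BertModel


class EntityEnhancedBERT(nn.Module):
    def __init__(self, num_labels: int, num_entity_types: int,
                 model_name: str = "bert-base-uncased"):
        super().__init__()
        self.bert = BertModel.from_pretrained(model_name)
        hidden = self.bert.config.hidden_size
        # Entity embedding layer: one vector per entity type (index 0 = "no entity"),
        # added to the token embeddings to inject domain-specific knowledge.
        self.entity_embeddings = nn.Embedding(num_entity_types, hidden, padding_idx=0)
        # Entity-aware attention: the [CLS] state queries entity tokens only.
        self.entity_attention = nn.MultiheadAttention(hidden, num_heads=8, batch_first=True)
        self.classifier = nn.Linear(hidden, num_labels)

    def forward(self, input_ids, attention_mask, entity_type_ids):
        # BERT adds position and segment embeddings internally when inputs_embeds is given,
        # so only the word embeddings are combined with the entity-type embeddings here.
        inputs_embeds = self.bert.embeddings.word_embeddings(input_ids)
        inputs_embeds = inputs_embeds + self.entity_embeddings(entity_type_ids)
        hidden_states = self.bert(inputs_embeds=inputs_embeds,
                                  attention_mask=attention_mask).last_hidden_state

        cls_state = hidden_states[:, :1]                        # (batch, 1, hidden)
        # Attend only to tagged entity tokens; if a questionnaire contains no tagged
        # entities, fall back to the ordinary (non-padding) attention mask.
        is_entity = (entity_type_ids > 0) & attention_mask.bool()
        no_entities = ~is_entity.any(dim=1, keepdim=True)
        key_padding_mask = ~torch.where(no_entities, attention_mask.bool(), is_entity)

        entity_context, _ = self.entity_attention(cls_state, hidden_states, hidden_states,
                                                  key_padding_mask=key_padding_mask)
        # Combine the standard [CLS] summary with the entity-focused summary and classify
        # into medical specialties.
        return self.classifier(cls_state.squeeze(1) + entity_context.squeeze(1))
```

In such a setup, `entity_type_ids` would typically come from a medical named-entity tagger run over the questionnaire text, with one integer label per WordPiece token; how the paper actually obtains and injects its entities may differ from this sketch.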