Larger models yield better results? Streamlined severity classification of ADHD-related concerns using BERT-based knowledge distillation.
This work focuses on the efficiency of the knowledge distillation approach in producing a lightweight yet powerful BERT-based model for natural language processing (NLP) applications. After creating the model, we applied the resulting model, LastBERT, to a real-world task: classifying severity level...
Saved in:
Main Authors: , ,
Format: Article
Language: English
Published: Public Library of Science (PLoS), 2025-01-01
Series: PLoS ONE
Online Access: https://doi.org/10.1371/journal.pone.0315829