Larger models yield better results? Streamlined severity classification of ADHD-related concerns using BERT-based knowledge distillation.

This work focuses on the efficiency of the knowledge distillation approach in generating a lightweight yet powerful BERT-based model for natural language processing (NLP) applications. After creating the model, we applied the resulting model, LastBERT, to a real-world task: classifying severity level...


Bibliographic Details
Main Authors: Ahmed Akib Jawad Karim, Kazi Hafiz Md Asad, Md Golam Rabiul Alam
Format: Article
Language: English
Published: Public Library of Science (PLoS) 2025-01-01
Series: PLoS ONE
Online Access: https://doi.org/10.1371/journal.pone.0315829