
Fig 2 | BMC Medical Informatics and Decision Making


From: Exploring the performance and explainability of fine-tuned BERT models for neuroradiology protocol assignment


(Left) The original pre-trained BERT is trained on two objectives: next sentence prediction (NSP) and masked language modeling (MLM). Special classification [CLS] and separator [SEP] tokens are inserted into the input to facilitate learning. (Right) BERT is fine-tuned for the classification task using labeled data from physician entries; the output is a class label corresponding to the assigned protocol.
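The [CLS]/[SEP] token layout described in the caption can be sketched in plain Python. This is an illustrative helper only (the function name `build_bert_input` is invented here); in practice a tokenizer such as the one shipped with the model inserts these tokens automatically:

```python
def build_bert_input(tokens_a, tokens_b=None):
    """Assemble a BERT-style input sequence.

    BERT prepends a [CLS] token, whose final hidden state is used as the
    aggregate representation for classification, and appends a [SEP] token
    after each segment. Segment IDs distinguish the two sentences used in
    NSP pre-training; single-sequence fine-tuning uses only segment 0.
    """
    tokens = ["[CLS]"] + tokens_a + ["[SEP]"]
    segment_ids = [0] * len(tokens)
    if tokens_b is not None:
        tokens += tokens_b + ["[SEP]"]
        segment_ids += [1] * (len(tokens_b) + 1)
    return tokens, segment_ids

# Single-sequence input, as in fine-tuning for protocol assignment
# (example tokens are hypothetical):
toks, segs = build_bert_input(["mri", "brain", "without", "contrast"])
print(toks)  # ['[CLS]', 'mri', 'brain', 'without', 'contrast', '[SEP]']
```

For the fine-tuned classifier on the right of the figure, the hidden state at the [CLS] position is fed to a classification head that outputs the protocol label.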
