Evaluating Pre-Trained Language Models for Focused Terminology Extraction from Swedish Medical Records
Region Östergötland.
RISE Research Institutes of Sweden.
Linköping University, Department of Health, Medicine and Caring Sciences, Division of Diagnostics and Specialist Medicine. Linköping University, Faculty of Medicine and Health Sciences. Region Östergötland, Center for Diagnostics, Medical radiation physics. ORCID iD: 0000-0001-8661-2232
2022 (English). In: Terminology in the 21st Century: Many Faces, Many Places, Term 2022 - held in conjunction with the International Conference on Language Resources and Evaluation, LREC 2022 - Proceedings / [ed] Rute Costa, Sara Carvalho, Ana Ostroški Anić, Anas Fahad Khan, European Language Resources Association, 2022, Vol. 2022.term-1, p. 30-32. Conference paper, Published paper (Refereed)
Abstract [en]

In the experiments briefly presented in this abstract, we compare the performance of a generalist Swedish pre-trained language model with a domain-specific Swedish pre-trained model on the downstream task of focused terminology extraction of implant terms, which are terms that indicate the presence of implants in the body of patients. The fine-tuning is identical for both models. For the search strategy we rely on a KD-Tree that we feed with two different lists of term seeds, one with noise and one without noise. Results show that the use of a domain-specific pre-trained language model has a positive impact on focused terminology extraction only when using term seeds without noise.

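The abstract gives no implementation details, but the seed-driven KD-Tree search strategy it describes can be sketched as follows. This is a minimal illustration under assumptions the abstract does not state: the generalist checkpoint is taken to be KB-BERT (KB/bert-base-swedish-cased on Hugging Face), term vectors are mean-pooled BERT embeddings, the seed and candidate terms are invented examples, and the distance threshold is arbitrary. The authors' actual models, term lists, and matching criterion may differ.

```python
# Sketch of seed-driven terminology extraction with a KD-Tree.
# Assumptions (not from the paper): mean-pooled embeddings, a fixed
# Euclidean distance cut-off, and illustrative Swedish term lists.
import numpy as np
import torch
from scipy.spatial import cKDTree
from transformers import AutoModel, AutoTokenizer

MODEL_NAME = "KB/bert-base-swedish-cased"  # generalist Swedish BERT

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModel.from_pretrained(MODEL_NAME)
model.eval()

def embed(terms: list[str]) -> np.ndarray:
    """Mean-pool the last hidden layer to get one vector per term."""
    enc = tokenizer(terms, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**enc).last_hidden_state      # (batch, seq, dim)
    mask = enc["attention_mask"].unsqueeze(-1)       # (batch, seq, 1)
    summed = (hidden * mask).sum(dim=1)
    counts = mask.sum(dim=1).clamp(min=1)
    return (summed / counts).numpy()

# Seed terms that denote implants (the "without noise" condition would
# use only clean seeds like these); candidates come from the corpus.
seeds = ["pacemaker", "höftprotes", "stent"]              # illustrative only
candidates = ["defibrillator", "blodtryck", "knäprotes"]  # illustrative only

tree = cKDTree(embed(seeds))                  # index the seed embeddings
dist, _ = tree.query(embed(candidates), k=1)  # nearest seed per candidate

THRESHOLD = 5.0  # assumed cut-off; would need tuning on held-out data
extracted = [t for t, d in zip(candidates, dist) if d < THRESHOLD]
print(extracted)
```

Swapping MODEL_NAME for a domain-specific Swedish checkpoint would reproduce the paper's comparison, and the "with noise" seed condition corresponds to adding irrelevant terms to `seeds`; per the reported results, the domain-specific model's advantage holds only when the seeds are noise-free.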
Place, publisher, year, edition, pages
European Language Resources Association, 2022. Vol. 2022.term-1, p. 30-32
Keywords [en]
terminology extraction, implant terms, generalist BERT, domain-specific BERT
National Category
Language Technology (Computational Linguistics)
Identifiers
URN: urn:nbn:se:liu:diva-190559
Scopus ID: 2-s2.0-85146268302
ISBN: 9791095546955 (print)
OAI: oai:DiVA.org:liu-190559
DiVA, id: diva2:1718756
Conference
Language Resources and Evaluation Conference (LREC 2022), Marseille, France, 20-25 June 2022
Available from: 2022-12-13. Created: 2022-12-13. Last updated: 2024-08-23. Bibliographically approved.

Open Access in DiVA

No full text in DiVA

Other links

Scopus
Paper

Authority records

Lundberg, Peter; Bjerner, Tomas; Al-Abasse, Yosef; Jönsson, Arne

Search in DiVA

By author/editor
Jerdhaf, Oskar; Lundberg, Peter; Bjerner, Tomas; Al-Abasse, Yosef; Jönsson, Arne
By organisation
Region Östergötland; Division of Diagnostics and Specialist Medicine; Faculty of Medicine and Health Sciences; Medical radiation physics; Human-Centered systems; Faculty of Science & Engineering
Language Technology (Computational Linguistics)
