Evaluating Pre-Trained Language Models for Focused Terminology Extraction from Swedish Medical Records
Region Östergötland.
RISE Research Institutes of Sweden.
Linköpings universitet, Institutionen för hälsa, medicin och vård, Avdelningen för diagnostik och specialistmedicin. Linköpings universitet, Medicinska fakulteten. Region Östergötland, Diagnostikcentrum, Medicinsk strålningsfysik. ORCID iD: 0000-0001-8661-2232
2022 (English). In: Proceedings of the Workshop on Terminology in the 21st century: many faces, many places / [ed] Rute Costa, Sara Carvalho, Ana Ostroški Anić, Anas Fahad Khan, European Language Resources Association, 2022, Vol. 2022.term-1, pp. 30-32. Conference paper, Published paper (Refereed)
Abstract [en]

In the experiments briefly presented in this abstract, we compare the performance of a generalist Swedish pre-trained language model with a domain-specific Swedish pre-trained model on the downstream task of focused terminology extraction of implant terms, i.e. terms that indicate the presence of implants in patients' bodies. The fine-tuning is identical for both models. For the search strategy we rely on a KD-Tree that we feed with two different lists of term seeds, one with noise and one without noise. Results show that the use of a domain-specific pre-trained language model has a positive impact on focused terminology extraction only when using term seeds without noise.
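The seed-driven KD-Tree search described above can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the toy random vectors stand in for embeddings from a Swedish pre-trained model, and the vocabulary and seed list are invented for the demo.

```python
# Sketch of KD-Tree-based focused term extraction: candidate terms whose
# embeddings lie close to seed-term embeddings are proposed as implant terms.
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)

# Toy stand-ins for contextual embeddings; in the paper these would come
# from a generalist or domain-specific Swedish pre-trained model.
vocab = ["pacemaker", "stent", "protes", "huvudvärk", "feber", "implantat"]
emb = {w: rng.normal(size=8) for w in vocab}
# Make the implant-related words cluster together for the demo.
for w in ("stent", "protes", "implantat"):
    emb[w] = emb["pacemaker"] + 0.05 * rng.normal(size=8)

candidates = [w for w in vocab if w != "pacemaker"]
tree = cKDTree(np.stack([emb[w] for w in candidates]))

# Seed list (here a single clean seed); querying the tree returns the
# candidates nearest to each seed in embedding space.
seeds = ["pacemaker"]
for s in seeds:
    dists, idxs = tree.query(emb[s], k=3)
    hits = [candidates[i] for i in idxs]
    print(s, "->", hits)
```

With a noisy seed list, off-topic seeds would pull in unrelated neighbours, which is consistent with the paper's finding that the domain-specific model only helps when the seeds are clean.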

Place, publisher, year, edition, pages
European Language Resources Association, 2022. Vol. 2022.term-1, pp. 30-32
Keywords [en]
terminology extraction, implant terms, generalist BERT, domain-specific BERT
HSV category
Identifiers
URN: urn:nbn:se:liu:diva-190559
OAI: oai:DiVA.org:liu-190559
DiVA, id: diva2:1718756
Conference
Language Resources and Evaluation Conference (LREC 2022), Marseille, France, 20-25 June 2022
Available from: 2022-12-13 Created: 2022-12-13 Last updated: 2024-02-07 Bibliographically approved

Open Access in DiVA

Full text not available in DiVA

Other links

Paper

By author/editor
Jerdhaf, Oskar; Lundberg, Peter; Bjerner, Tomas; Al-Abasse, Yosef; Jönsson, Arne