Relation Classification using Semantically-Enhanced Syntactic Dependency Paths: Combining Semantic and Syntactic Dependencies for Relation Classification using Long Short-Term Memory Networks
Linköping University, Department of Computer and Information Science, Human-Centered systems.
2018 (English). Independent thesis, Advanced level (degree of Master (Two Years)), 20 credits / 30 HE credits. Student thesis.
Abstract [en]

Many approaches to tasks in Natural Language Processing (NLP) use syntactic dependency trees (SDTs) as features representing the latent nonlinear structure within sentences. Recently, work on parsing sentences into graph-based structures that encode semantic relationships between words, called semantic dependency graphs (SDGs), has gained interest. This thesis explores the use of SDGs in place of and alongside SDTs within a relation classification system based on long short-term memory (LSTM) neural networks. Two methods for handling the information in these graphs are presented and compared across two SDG formalisms. Three new relation extraction system architectures based on these methods are compared to a recent state-of-the-art LSTM-based system; when semantic dependencies are used to enhance syntactic dependencies, they achieve comparable results with significantly fewer training parameters.
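To make the general idea concrete, the following is a minimal sketch, in PyTorch, of the kind of system the abstract describes: an LSTM that encodes the tokens along a dependency path between two entity mentions and classifies their relation. It is not the thesis's actual architecture; all class names, dimensions, and the number of relation labels are illustrative assumptions.

# Minimal sketch (illustrative only, not the thesis's architecture):
# classify the relation between two entities from the tokens on the
# dependency path connecting them, encoded with an LSTM.
import torch
import torch.nn as nn

class PathLSTMClassifier(nn.Module):
    def __init__(self, vocab_size, embed_dim=100, hidden_dim=128, num_relations=19):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)        # token embeddings
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, num_relations)         # relation scores

    def forward(self, path_token_ids):
        # path_token_ids: (batch, path_len) ids of tokens along the dependency path
        embedded = self.embed(path_token_ids)
        _, (h_n, _) = self.lstm(embedded)                        # final hidden state
        return self.out(h_n[-1])                                 # logits over relation labels

# Usage with a hypothetical 5-token dependency path:
model = PathLSTMClassifier(vocab_size=10000)
path = torch.randint(0, 10000, (1, 5))
logits = model(path)
print(logits.shape)  # torch.Size([1, 19])

Systems of this kind could feed the path extracted from an SDT, an SDG, or a combination of both into such an encoder; the thesis compares architectures along those lines.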

Place, publisher, year, edition, pages
2018, p. 48
Keywords [en]
Natural language processing, NLP, computational linguistics, syntactic dependency trees, semantic dependency graphs, relation classification, relation extraction, artificial intelligence, machine learning, deep learning, neural networks, long short-term memory, LSTM
National Category
Language Technology (Computational Linguistics)
Identifiers
URN: urn:nbn:se:liu:diva-153877
ISRN: LIU-IDA/LITH-EX-A--18/007--SE
OAI: oai:DiVA.org:liu-153877
DiVA, id: diva2:1279066
Subject / course
Computer science
Presentation
2018-06-08, 15:15 (English)
Available from: 2019-01-16. Created: 2019-01-15. Last updated: 2019-01-16. Bibliographically approved.

Open Access in DiVA

fulltext (515 kB), 321 downloads
File information
File name: FULLTEXT01.pdf
File size: 515 kB
Checksum (SHA-512): ebc3f452906ca4fe580568dc9412ef7c15a34ee10e02e50555679f0683a141ba007e680773c90ca3edf9b9dd3ec8971b05e04804294324132bad690c6f8ea478
Type: fulltext
Mimetype: application/pdf

Search in DiVA

By author/editor
Capshaw, Riley
By organisation
Human-Centered systems
Language Technology (Computational Linguistics)

Total: 321 downloads
The number of downloads is the sum of all downloads of full texts. It may include, for example, previous versions that are no longer available.

Total: 794 hits