Label Efficient Representation Learning for Relation Extraction

Doctoral Researcher
Role at KCDS: KCDS Fellow

KCDS Supervisors
Role at KCDS: MATH Supervisor, member of the Steering Committee

Abstract

This research project investigates deep learning-based language models as a mechanism for efficiently learning representations suited to the downstream task of relation extraction. Existing approaches to relation extraction require manually annotating large amounts of text, rendering them infeasible for many applications. Our goal is to enable the extraction of structured knowledge from large text collections given few or no annotated examples.

To this end, we investigate the realism of benchmark tasks, evaluate existing methods, and develop new approaches in areas such as few- and zero-shot learning, weak and self-supervision, and data augmentation for relation extraction.
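To illustrate the kind of zero-shot setting targeted here, the sketch below frames relation extraction as natural language inference using the Hugging Face transformers zero-shot classification pipeline. The model choice, example sentence, entity pair, and candidate relation labels are illustrative assumptions and not part of the methods developed in this project.

```python
# Minimal, hypothetical sketch: zero-shot relation extraction via NLI.
# All names below (model, sentence, entities, relation labels) are assumptions.
from transformers import pipeline

classifier = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")

sentence = "Marie Curie was born in Warsaw in 1867."
head, tail = "Marie Curie", "Warsaw"

# Candidate relations are verbalized so the NLI model can score them as
# hypotheses about the (head, tail) pair -- no relation-labeled data needed.
candidate_relations = ["place of birth", "place of death", "employer", "no relation"]

result = classifier(
    sentence,
    candidate_relations,
    hypothesis_template=f"The relation between {head} and {tail} is {{}}.",
)

# `result["labels"]` is sorted by score; the top entry is the predicted relation.
print(result["labels"][0], result["scores"][0])
```

In this formulation, each candidate relation is turned into a textual hypothesis about the entity pair, so an off-the-shelf NLI model can rank relations for a sentence without any annotated relation extraction examples.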