A TextGraphs Acceptance from Yanjun: Learning Clause Representation from Dependency-Anchor Graph for Connective Prediction

Clauses are the fundamental text units of complex sentences. A recent paper from Yanjun Gao, "Learning Clause Representation from Dependency-Anchor Graph for Connective Prediction", addresses the problem of clause representation and has been accepted to the NAACL Workshop on Graph-Based Natural Language Processing (TextGraphs 2021)! In this work, Yanjun proposes a dependency-anchor graph representation that encodes two kinds of syntactic information and highlights the most important constituents: the subject and the verb phrase. A graph-based neural model, DAnCE, is designed to encode this graph representation and generate a linguistically aware clause representation.
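To make the idea concrete, here is a minimal, illustrative sketch of building a toy dependency-anchor-style graph for a single clause. This is not the paper's construction or the DAnCE model: the use of spaCy for parsing, networkx for the graph, and the particular dependency labels chosen to mark the subject and verb anchors are all assumptions made for illustration.

# Illustrative sketch only (not the paper's exact method): parse a clause,
# then build a small graph whose "anchor" nodes are the subject and the
# main verb, with the remaining tokens attached via dependency edges.
import spacy
import networkx as nx

nlp = spacy.load("en_core_web_sm")

def dependency_anchor_graph(clause: str) -> nx.DiGraph:
    doc = nlp(clause)
    g = nx.DiGraph()
    for tok in doc:
        g.add_node(tok.i, text=tok.text, pos=tok.pos_)
        # Mark the subject and the main verb as anchor nodes
        # (label choices here are assumptions, not the paper's).
        if tok.dep_ in ("nsubj", "nsubjpass"):
            g.nodes[tok.i]["anchor"] = "subject"
        elif tok.dep_ == "ROOT" and tok.pos_ in ("VERB", "AUX"):
            g.nodes[tok.i]["anchor"] = "verb_phrase"
        # Attach every non-root token to its syntactic head.
        if tok.head.i != tok.i:
            g.add_edge(tok.head.i, tok.i, dep=tok.dep_)
    return g

g = dependency_anchor_graph("The students revised their essays")
print([(d["text"], d.get("anchor")) for _, d in g.nodes(data=True)])

A graph encoder such as DAnCE would then operate over a structure of this general shape to produce a clause vector; the snippet above only shows how the syntactic anchors might be identified.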

This paper focuses on clause representation for connective prediction. Discourse connectives are critical indicators of text coherence and connect clauses into complex sentences. Thus, the study of connectives could facilitate many downstream applications, such as coherence modeling and writing assessment. Previous studies in education show that students perform well when filling in the correct connectives, but tend to use connectives loosely in their own writing. The paper also presents DeSSE, a newly annotated dataset collected from students' essays, along with a modified version of an existing public corpus, to train and evaluate the model. Results show that DAnCE achieves competitive performance on both datasets and indicate a possible application in student writing revision: suggesting the correct connectives.

The workshop will be hosted virtually on June 11, 2021. Yanjun will release the dataset, code, and paper at that time. Stay tuned!
