id: work_fnpfl6ausndjxf26tx3wnlmpl4
author: Yang Liu
title: Learning Structured Text Representations
date: 2018
pages: 15
extension: .pdf
mime: application/pdf
words: 7777
sentences: 892
flesch: 64
cache: ./cache/work_fnpfl6ausndjxf26tx3wnlmpl4.pdf
txt: ./txt/work_fnpfl6ausndjxf26tx3wnlmpl4.txt
summary: Drawing inspiration from recent efforts to give neural networks a structural bias (Kim et al., 2017), the authors propose a model that can encode a document while automatically inducing rich structural dependencies: a differentiable non-projective parsing algorithm is embedded into a neural model, and attention mechanisms are used to incorporate the structural biases. Inspired by existing theories of discourse, representations of document structure have assumed several guises in the literature, such as trees in the style of Rhetorical Structure Theory (RST; Mann and Thompson, 1988). The model learns meaningful task-specific dependency structures between the tokens within a sentence, generating a context representation for each word that carries weak structural information, and achieves competitive results. Structure is taken into account while learning representations for both sentences and documents. The structure of the induced tree T, obtained through an attention mechanism detailed in the paper, is then exploited to derive more precise representations: applying a sentence-level bi-LSTM together with the proposed structured attention mechanism yields updated token representations. The paper presents three variants of the model, one with structured attention at the sentence level, another at the document level, and a third combining both.
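
The summary describes embedding a differentiable non-projective parsing algorithm into a neural model, with arc probabilities acting as attention weights. As a rough illustration of the underlying technique, the sketch below computes the marginal probability of every dependency arc with the Matrix-Tree theorem (Koo et al., 2007), the construction this kind of structured attention rests on. The function name, the numpy implementation, and the random inputs are illustrative assumptions, not the paper's code.

    import numpy as np

    def arc_marginals(f, f_root):
        """Marginal probabilities of dependency arcs via the Matrix-Tree theorem.

        f      : (n, n) unnormalized scores; f[i, j] scores token i as the
                 parent (head) of token j.
        f_root : (n,)   unnormalized scores for each token being the root.

        Returns (p, p_root): p[i, j] is the marginal probability that i is
        the parent of j under the induced distribution over non-projective
        dependency trees; p_root[i] is the probability that i is the root.
        """
        n = f.shape[0]
        A = np.exp(f) * (1.0 - np.eye(n))      # edge weights, self-loops removed
        r = np.exp(f_root)                     # root-attachment weights

        # Laplacian: L[j, j] = sum of column j of A, L[i, j] = -A[i, j] otherwise.
        L = np.diag(A.sum(axis=0)) - A
        # Koo et al.'s variant: the first row of L is replaced by the root weights.
        L_bar = L.copy()
        L_bar[0, :] = r

        L_inv = np.linalg.inv(L_bar)

        # 0-based transcription of the marginal formula:
        # P(i -> j) = [j != 0] * A[i, j] * L_inv[j, j]
        #           - [i != 0] * A[i, j] * L_inv[j, i]
        term1 = A * np.diag(L_inv)[None, :]
        term1[:, 0] = 0.0
        term2 = A * L_inv.T
        term2[0, :] = 0.0
        p = term1 - term2
        p_root = r * L_inv[:, 0]               # P(root = i) = r[i] * L_inv[i, 0]
        return p, p_root

    # Hypothetical usage: in the model the scores would come from a bilinear
    # layer over bi-LSTM states; random values stand in for them here.
    rng = np.random.default_rng(0)
    n, d = 5, 8
    h = rng.normal(size=(n, d))                # token vectors (assumed)
    p, p_root = arc_marginals(rng.normal(size=(n, n)), rng.normal(size=n))
    print(p.sum(axis=0) + p_root)              # each entry should be ~1: every
                                               # token has exactly one parent
    parents = p.T @ h                          # soft parent vector per token

Roughly speaking, these marginals play the role that softmax weights play in standard attention: each token's representation can be augmented with an expected parent (or children) vector, and since every step is differentiable, the tree-inducing scores are trained end to end with the downstream objective.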