id: cord-122344-2lepkvby
author: Hayashi, Hiroaki
title: What's New? Summarizing Contributions in Scientific Literature
date: 2020-11-06
extension: .txt
mime: text/plain
words: 7260
sentences: 383
flesch: 44
summary: To overcome this problem, we introduce a new task of disentangled paper summarization, which seeks to generate separate summaries for the paper's contributions and for the context of the work, making it easier to identify the key findings shared in articles. The task's goal is to generate two summaries simultaneously: one strictly focused on the summarized article's novelties and contributions, the other introducing the context of the work and previous efforts. Recent trends in abstractive text summarization show a shift of focus from designing task-specific architectures trained from scratch (See et al., 2017; Paulus et al., 2018) to leveraging large-scale Transformer-based models pre-trained on vast amounts of data (Liu & Lapata, 2019; Lewis et al., 2020), often in multi-task settings (Raffel et al., 2019). In this paper, we propose disentangled paper summarization, a new task in scientific paper summarization in which models simultaneously generate contribution and context summaries.
cache: ./cache/cord-122344-2lepkvby.txt
txt: ./txt/cord-122344-2lepkvby.txt