id: cord-025610-7vouj8pp
author: Latif, Seemab
title: Backward-Forward Sequence Generative Network for Multiple Lexical Constraints
date: 2020-05-06
extension: .txt
mime: text/plain
words: 3923
sentences: 230
flesch: 50
cache: ./cache/cord-025610-7vouj8pp.txt
txt: ./txt/cord-025610-7vouj8pp.txt

summary: In this paper, we propose a novel neural probabilistic architecture, based on a backward-forward language model and a word embedding substitution method, that can cater to multiple lexical constraints when generating high-quality sequences. Recently, language models based on Recurrent Neural Networks (RNNs) and their variants, such as Long Short-Term Memory networks (LSTMs) and Gated Recurrent Units (GRUs), have shown promising results in generating high-quality text sequences, especially when the input and output are of variable length. Earlier work first proposed multiple variants of Backward and Forward (B/F) language models based on GRUs for constrained sentence generation [13]. Therefore, we have proposed a neural probabilistic Backward-Forward architecture that generates high-quality sequences, combined with a word embedding substitution method to satisfy multiple constraints. In this paper, we have proposed a novel method, dubbed the Neural Probabilistic Backward-Forward language model with word embedding substitution, to address the problem of lexically constrained sequence generation.
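The two ideas the summary names can be illustrated with a minimal, hypothetical sketch: a backward-forward generator grows a sentence outward from one constraint word (left context first, then right), and an embedding-substitution step swaps the remaining constraint into the most similar generated position. The toy dictionaries and 2-d vectors below are stand-ins invented for illustration; the paper itself uses GRU language models and real word embeddings.

```python
import math

BOS, EOS = "<s>", "</s>"

# Toy "backward LM": most likely PREVIOUS token (stand-in for a GRU run right-to-left).
backward_lm = {"sat": "cat", "cat": "the", "the": BOS}
# Toy "forward LM": most likely NEXT token (stand-in for a GRU run left-to-right).
forward_lm = {"sat": "on", "on": "the", "the": "mat", "mat": EOS}

def generate(constraint):
    """Grow a sequence outward from the constraint word: backward, then forward."""
    left, tok = [], constraint
    while (prev := backward_lm.get(tok, BOS)) != BOS:   # backward phase
        left.append(prev)
        tok = prev
    left.reverse()
    right, tok = [], constraint
    while (nxt := forward_lm.get(tok, EOS)) != EOS:     # forward phase
        right.append(nxt)
        tok = nxt
    return left + [constraint] + right

# Toy 2-d embeddings (hypothetical values, chosen so "rug" is nearest to "mat").
emb = {"the": (0.1, 0.1), "cat": (0.2, 0.9), "sat": (0.5, 0.5),
       "on": (0.1, 0.2), "mat": (0.9, 0.1), "rug": (0.85, 0.15)}

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

def substitute(tokens, constraint):
    """Replace the generated token most similar to the extra constraint word."""
    best = max(range(len(tokens)),
               key=lambda i: cosine(emb[tokens[i]], emb[constraint]))
    out = list(tokens)
    out[best] = constraint
    return out

sentence = generate("sat")                 # first constraint seeds generation
sentence = substitute(sentence, "rug")     # second constraint enters by substitution
print(" ".join(sentence))
```

Run as-is, the first constraint ("sat") anchors generation and the second ("rug") displaces its nearest neighbour ("mat"), which is the division of labour between the B/F model and the substitution step that the summary describes.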