id: cord-241351-li476eqy
author: Liu, Junhua
title: CrisisBERT: a Robust Transformer for Crisis Classification and Contextual Crisis Embedding
date: 2020-05-11
pages:
extension: .txt
mime: text/plain
words: 3860
sentences: 243
flesch: 50
summary: However, none of the works perform crisis embedding and classification using state-of-the-art attention-based deep neural network models, such as Transformers and document-level contextual embeddings. This work proposes CrisisBERT, an end-to-end transformer-based model for two crisis classification tasks, namely crisis detection and crisis recognition, which shows promising results in both accuracy and F1 scores. While prior works report remarkable performance on various crisis classification tasks using NN models and word embeddings, no studies leverage the most recent Natural Language Understanding (NLU) techniques, such as attention-based deep classification models [21] and document-level contextual embeddings [22], which reportedly improve state-of-the-art performance on many challenging natural language problems, from upstream tasks such as Named Entity Recognition and Part-of-Speech Tagging to downstream tasks such as Machine Translation and Neural Conversation. In this work, we investigate the transformer approach to crisis classification tasks and propose CrisisBERT, a transformer-based classification model that surpasses conventional linear and deep learning models in performance and robustness.
cache: ./cache/cord-241351-li476eqy.txt
txt: ./txt/cord-241351-li476eqy.txt
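The summary refers to attention-based transformer classifiers. As a minimal illustrative sketch (not CrisisBERT itself, and not the authors' code), the scaled dot-product attention at the core of such models can be written in plain Python; real implementations operate on batched tensors with learned query/key/value projections, which are omitted here.

```python
# Minimal sketch of scaled dot-product attention, the mechanism
# underlying transformer classifiers. Illustrative only.
import math

def softmax(xs):
    """Numerically stable softmax over a list of floats."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def attention(queries, keys, values):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V."""
    d = len(keys[0])  # key dimension used for scaling
    out = []
    for q in queries:
        scores = [dot(q, k) / math.sqrt(d) for k in keys]
        weights = softmax(scores)
        # Each output is a convex combination of the value vectors.
        out.append([sum(w * v[i] for w, v in zip(weights, values))
                    for i in range(len(values[0]))])
    return out

# Toy self-attention over three 2-dimensional token vectors.
tokens = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
context = attention(tokens, tokens, tokens)
```

In a transformer classifier, stacks of such attention layers produce contextual token representations, which are pooled and fed to a classification head.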