
Abstract Details

Activity Number: 433 - Statistical Approaches in Text Analysis
Type: Topic-Contributed
Date/Time: Thursday, August 12, 2021, 4:00 PM to 5:50 PM EDT
Sponsor: Text Analysis Interest Group
Abstract #317199
Title: NukeLM: Pre-Trained and Fine-Tuned Language Models for the Nuclear and Energy Domains
Author(s): Daniel Fortin* and Lee Burke and Karl Pazdernik and Benjamin Wilson and Rustam Goychayev and John Mattingly
Companies: Pacific Northwest National Laboratory (Fortin, Burke, Pazdernik, Wilson, Goychayev) and North Carolina State University (Mattingly)
Keywords: Transformer; BERT; Natural Language Processing; Nuclear Nonproliferation
Abstract:

Natural language processing (NLP) tasks have seen substantial improvements over the last few years, driven largely by language models such as BERT, which achieve deep knowledge transfer by pre-training a large model on general text and then fine-tuning it on specific tasks. The BERT architecture has shown even better performance on domain-specific tasks when the model is pre-trained on domain-relevant texts. Inspired by these advances, we developed NukeLM, a nuclear-domain language model pre-trained on 1.5 million abstracts from the U.S. Department of Energy Office of Scientific and Technical Information (OSTI) database. The NukeLM model is then fine-tuned to classify research articles either into binary classes (related to the nuclear fuel cycle (NFC) or not) or into multiple categories describing the subject of the article. We show that continued pre-training of a BERT-style architecture prior to fine-tuning yields better performance on both article classification tasks. This capability is critical for properly triaging manuscripts and for uncovering new areas of research in the nuclear (or nuclear-relevant) domains.
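
The two-stage recipe described above (continued masked-language-model pre-training on domain text, then task-specific fine-tuning) can be sketched in a few lines of Python. The following is a minimal illustration assuming the Hugging Face transformers and datasets libraries; the base checkpoint, file path, output directory, and label count are placeholders for illustration, not details taken from the paper.

    # Hypothetical sketch of the two-stage approach the abstract describes:
    # Stage 1 continues masked-language-model (MLM) pre-training of a
    # BERT-style model on domain abstracts; Stage 2 fine-tunes the adapted
    # checkpoint for article classification. All names are illustrative.
    from transformers import (
        AutoModelForMaskedLM,
        AutoModelForSequenceClassification,
        AutoTokenizer,
        DataCollatorForLanguageModeling,
        Trainer,
        TrainingArguments,
    )
    from datasets import load_dataset

    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

    # --- Stage 1: continued pre-training on domain abstracts (MLM) ---
    abstracts = load_dataset("text", data_files={"train": "osti_abstracts.txt"})
    tokenized = abstracts.map(
        lambda batch: tokenizer(batch["text"], truncation=True, max_length=512),
        batched=True,
        remove_columns=["text"],
    )
    mlm_model = AutoModelForMaskedLM.from_pretrained("bert-base-uncased")
    collator = DataCollatorForLanguageModeling(
        tokenizer=tokenizer, mlm_probability=0.15
    )
    Trainer(
        model=mlm_model,
        args=TrainingArguments(output_dir="nukelm-pretrained", num_train_epochs=1),
        train_dataset=tokenized["train"],
        data_collator=collator,
    ).train()
    mlm_model.save_pretrained("nukelm-pretrained")

    # --- Stage 2: fine-tune the domain-adapted checkpoint for classification ---
    # num_labels=2 corresponds to the binary task (NFC-related or not); the
    # multi-category task would simply use a larger num_labels.
    clf_model = AutoModelForSequenceClassification.from_pretrained(
        "nukelm-pretrained", num_labels=2
    )
    # ...fine-tune clf_model on labeled articles with a second Trainer as above.

Starting Stage 2 from the domain-adapted checkpoint rather than the generic one is the comparison the abstract reports: the classification head is initialized fresh either way, but the encoder weights carry the domain signal.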


Authors who are presenting talks have a * after their name.
