fermi-bert-1024 is a BERT model pre-trained for the nuclear energy domain. It was trained on data from Wikipedia, Books3, and the US Nuclear Regulatory Commission's ADAMS document database, making it well suited to the technical terminology and regulatory language of the nuclear industry. Pre-training ran on the Frontier supercomputer at Oak Ridge National Laboratory, using 128 AMD MI250X GPUs for 10 hours, and the resulting model provides a solid foundation for downstream nuclear energy applications.
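A minimal usage sketch with the Hugging Face `transformers` library, assuming the checkpoint is published on the Hub under the id `fermi-bert-1024` (the exact hub id, including any organization prefix, is an assumption) and that it carries a standard BERT masked-language-modeling head:

```python
from transformers import pipeline

# Hub id is an assumption; adjust to the actual organization/name if different.
model_id = "fermi-bert-1024"

# BERT-style checkpoints support the fill-mask task out of the box.
fill_mask = pipeline("fill-mask", model=model_id)

# A domain-specific prompt: the model should rank nuclear terminology highly.
predictions = fill_mask("The reactor was shut down for scheduled [MASK].")
for p in predictions:
    print(f"{p['token_str']!r}  score={p['score']:.3f}")
```

If, as the name suggests, `1024` is the model's maximum sequence length, longer regulatory documents can be encoded without the usual 512-token truncation of base BERT.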