Transformer-based large language foundation models for text generation: A comprehensive literature review for different languages and application domains
- Title
- Transformer-based large language foundation models for text generation: A comprehensive literature review for different languages and application domains
- Creators
- Raphael Souza de Oliveira; Erick Giovani Sperandio Nascimento - University of Surrey, School of Computer Science and Electronic Engineering
- Publication Details
- Information Processing & Management, Vol. 63(2), p. 104477
- Publisher
- Elsevier Ltd
- Number of pages
- 30
- First online publication date
- 20/11/2025
- Publication Date
- 03/2026
- Date accepted for publication
- 06/11/2025
- Grant note
- National Council for Scientific and Technological Development (CNPq, Brazil)
The authors thank the Supercomputing Centre for Industrial Innovation (CS2i) from SENAI CIMATEC (Brazil), as well as the Surrey Institute for People-Centred AI at the University of Surrey (UK), for their scientific and technical support. We also thank the Regional Labour Court of the 5th Region for investing in research contributing to the scientific community and technological development. Furthermore, we thank the National Council for Scientific and Technological Development (CNPq, Brazil) for their support. Erick G. Sperandio Nascimento is a CNPq technological development fellow (Proc. 308963/2022-9). For the purpose of open access, the authors have applied a Creative Commons attribution license (CC BY) to any Author Accepted Manuscript version arising from this submission.
- Identifiers
- 991073428702346; WOS:001626793500001
- Academic Unit
- School of Computer Science and Electronic Engineering
- Language
- English
- Resource Type
- Journal article