Submit a Manuscript to the Journal

Cogent Engineering

For an Article Collection on

Foundation Models for Generating Pretrained Transformers: Key Technologies and Challenges

Manuscript deadline
30 June 2024


Article collection guest advisor(s)

Jenhui Chen, Professor, Dept. Computer Science & Information Engineering, College of Engineering, Chang Gung University
[email protected]


Foundation Models for Generating Pretrained Transformers: Key Technologies and Challenges

Generative Pretrained Transformers (GPTs) have revolutionized the field of artificial intelligence, enabling impressive capabilities in natural language generation, image generation, video generation, and many other domains. This Article Collection aims to showcase cutting-edge research, applications, and advancements in GPTs. We invite original contributions spanning a wide range of topics related to GPTs, including but not limited to text generation, multimodal generation (e.g., text-to-image, text-to-video), fine-tuning strategies, and ethical considerations.

GPTs are pivotal because they can generate human-like text, images, and other media. They drive advances in natural language understanding, content creation, and creative applications, with significant impact on fields such as natural language processing, healthcare, and entertainment, as well as potential ethical and societal implications.

Contributions should address, but are not limited to, the following topics:

  • Text/Image/Video Generation: Techniques for improving the multimedia content generation capabilities of GPTs across text, image, and video tasks.
  • Multimodal Generation: Approaches for combining text with other modalities, such as images or video, to generate coherent and meaningful content.
  • Fine-Tuning Strategies: Methods for fine-tuning GPT models for specific tasks or domains.
  • Evaluation Metrics: New metrics and evaluation techniques for assessing the quality and performance of GPT-generated content.
  • Creative Applications: Innovative and creative applications of GPTs in fields like art, literature, music, and more.
  • Explainability and Interpretability: Methods for making GPT-generated content more transparent and interpretable.
  • Scalability and Efficiency: Approaches to scale GPT models and improve their efficiency for practical applications.
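
As an illustration of the fine-tuning and efficiency topics above, one widely studied strategy is low-rank adaptation (LoRA), in which the pretrained weights stay frozen and only a small low-rank update is trained. The following is a minimal NumPy sketch of that idea; all names, dimensions, and values here are illustrative assumptions, not part of this call:

```python
import numpy as np

rng = np.random.default_rng(0)

d, r = 8, 2                          # model width and adapter rank (illustrative)
alpha = 1.0                          # LoRA scaling factor
W = rng.normal(size=(d, d))          # frozen pretrained weight matrix
A = rng.normal(size=(r, d)) * 0.01   # trainable low-rank factor
B = np.zeros((d, r))                 # zero-initialized, so training starts at W

def adapted_forward(x):
    """Forward pass through the adapted weight W + (alpha / r) * B @ A."""
    return x @ (W + (alpha / r) * B @ A).T

x = rng.normal(size=(1, d))
# With B still zero, the adapted model reproduces the frozen one exactly.
assert np.allclose(adapted_forward(x), x @ W.T)
# Only A and B (2 * r * d values) are trained, far fewer than the d * d in W.
print(A.size + B.size, "trainable vs", W.size, "frozen parameters")
```

In practice the same low-rank update is applied inside each transformer layer, which is one reason fine-tuning and scalability, two of the topics listed above, are closely linked.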

All manuscripts submitted to this Article Collection will undergo a full peer-review; the Guest Advisor for this collection will not be handling the manuscripts (unless they are an Editorial Board member). Please review the journal scope and author submission instructions prior to submitting a manuscript.

The deadline for submitting manuscripts is 30 June 2024.

Please contact Zhan Yu at [email protected] with any queries, or for discount codes, regarding this Article Collection.

Please be sure to select "Foundation Models for Generating Pretrained Transformers: Key Technologies and Challenges" from the drop-down menu in the submission system.

Benefits of publishing open access with Taylor & Francis

  • Global marketing and publicity, ensuring your research reaches the audience you want.
  • Article Collections bring together the latest research on hot topics from influential researchers across the globe.
  • Rigorous peer review for every open access article.
  • Rapid online publication, allowing you to share your work quickly.
