Mastering Transformers: The Journey from BERT to Large Language Models and Stable Diffusion
By: Savaş Yıldırım (Author), Meysam Asgari-Chenaghlu (Author)
Publisher: Packt Publishing
Edition: 2nd ed.
Publication Date: June 3, 2024
Language: English
Print Length: 462 pages
ISBN-10: 1837633789
ISBN-13: 978-1837633784
Book Description
Explore transformer-based language models from BERT to GPT, delving into NLP and computer vision tasks, while tackling challenges effectively.

Key Features
- Understand the complexity of deep learning architecture and transformers architecture
- Create solutions to industrial natural language processing (NLP) and computer vision (CV) problems
- Explore challenges in the preparation process, such as problem- and language-specific dataset transformation
- Purchase of the print or Kindle book includes a free PDF eBook

Book Description
Transformer-based language models such as BERT, T5, GPT, DALL-E, and ChatGPT have dominated NLP studies and become a new paradigm. Thanks to their accurate and fast fine-tuning capabilities, transformer-based language models have been able to outperform traditional machine learning-based approaches for many challenging natural language understanding (NLU) problems.

Aside from NLP, a fast-growing area in multimodal learning and generative AI has recently been established, showing promising results. Mastering Transformers will help you understand and implement multimodal solutions, including text-to-image. Computer vision solutions based on transformers are also explained in the book. You'll get started by understanding various transformer models before learning how to train different autoregressive language models such as GPT and XLNet. The book will also get you up to speed with boosting model performance, as well as tracking model training using the TensorBoard toolkit. In the later chapters, you'll focus on using vision transformers to solve computer vision problems. Finally, you'll discover how to harness the power of transformers to model time series data and make predictions.

By the end of this transformers book, you'll have an understanding of transformer models and how to use them to solve challenges in NLP and CV.

What you will learn
- Focus on solving simple-to-complex NLP problems with Python
- Discover how to solve classification/regression problems with traditional NLP approaches
- Train a language model and explore how to fine-tune models for downstream tasks
- Understand how to use transformers for generative AI and computer vision tasks
- Build transformer-based NLP apps with the Python transformers library
- Focus on language generation such as machine translation and conversational AI in any language
- Speed up transformer model inference to reduce latency

Who this book is for
This book is for deep learning researchers, hands-on practitioners, and ML/NLP researchers. Educators, as well as students who have a good command of programming, knowledge in the fields of machine learning and artificial intelligence, and who want to develop apps for NLP as well as multimodal tasks, will also benefit from this book's hands-on approach. Knowledge of Python (or any programming language) and machine learning literature, as well as a basic understanding of computer science, are required.

Table of Contents
1. From Bag-of-Words to the Transformer
2. A Hands-On Introduction to the Subject
3. Autoencoding Language Models
4. Autoregressive Language Models
5. Fine-Tuning Language Model for Text Classification
6. Fine-Tuning Language Models for Token Classification
7. Text Representation
8. Boosting Your Model Performance
9. Parameter Efficient Fine-Tuning
10. Zero-Shot and Few-Shot Learning in NLP
11. Explainable AI (XAI) for NLP
12. Working with Efficient Transformers
13. Cross-Lingual Language Modeling
14. Serving Transformer Models
(N.B. Please use the Read Sample option to see further chapters)