Mastering Transformers : The Journey from BERT to Large Language Models and Stable Diffusion

Saved in:
Bibliographic Details
Main Author: Yildirim, Savas. (Author)
Other Authors: Chenaghlu, Meysam Asgari-. (Author)
Format: E-Book
Language: English
Published: Birmingham : Packt Publishing.
Other Locations: See in the Sudoc
Summary: Explore transformer-based language models from BERT to GPT, delving into NLP and computer vision tasks, while tackling challenges effectively.

Key Features:
- Understand the complexity of deep learning architecture and transformers architecture
- Create solutions to industrial natural language processing (NLP) and computer vision (CV) problems
- Explore challenges in the preparation process, such as problem- and language-specific dataset transformation
- Purchase of the print or Kindle book includes a free PDF eBook

Book Description: Transformer-based language models such as BERT, T5, GPT, DALL-E, and ChatGPT have dominated NLP studies and become a new paradigm. Thanks to their accurate and fast fine-tuning capabilities, transformer-based language models have been able to outperform traditional machine learning-based approaches for many challenging natural language understanding (NLU) problems. Aside from NLP, a fast-growing area in multimodal learning and generative AI has recently been established, showing promising results. Mastering Transformers will help you understand and implement multimodal solutions, including text-to-image. Computer vision solutions that are based on transformers are also explained in the book. You'll get started by understanding various transformer models before learning how to train different autoregressive language models such as GPT and XLNet. The book will also get you up to speed with boosting model performance, as well as tracking model training using the TensorBoard toolkit. In the later chapters, you'll focus on using vision transformers to solve computer vision problems. Finally, you'll discover how to harness the power of transformers to model time series data and make predictions. By the end of this transformers book, you'll have an understanding of transformer models and how to use them to solve challenges in NLP and CV.

What you will learn:
- Focus on solving simple-to-complex NLP problems with Python
- Discover how to solve classification/regression problems with traditional NLP approaches
- Train a language model and explore how to fine-tune models to downstream tasks
- Understand how to use transformers for generative AI and computer vision tasks
- Build transformer-based NLP apps with the Python transformers library
- Focus on language generation such as machine translation and conversational AI in any language
- Speed up transformer model inference to reduce latency

Who this book is for: This book is for deep learning researchers, hands-on practitioners, and ML/NLP researchers. Educators, as well as students who have a good command of programming subjects, knowledge in the field of machine learning and artificial intelligence, and who want to develop apps in the field of NLP as well as multimodal tasks, will also benefit from this book's hands-on approach. Knowledge of Python (or any programming language) and machine learning literature, as well as a basic understanding of computer science, are required.
Online Access: Access to the E-book
LEADER 04917nmm a2200577 i 4500
001 ebook-280315171
005 20240917153300.0
007 cu|uuu---uuuuu
008 240917s2024||||uk ||||g|||| ||||||eng d
020 |a 9781837631506 
035 |a (OCoLC)1456999098 
035 |a FRCYB88957623 
035 |a FRCYB26088957623 
035 |a FRCYB14088957623 
035 |a FRCYB19188957623 
035 |a FRCYB24288957623 
035 |a FRCYB26888957623 
035 |a FRCYB27488957623 
035 |a FRCYB24788957623 
035 |a FRCYB24888957623 
035 |a FRCYB29388957623 
035 |a FRCYB29588957623 
035 |a FRCYB55488957623 
035 |a FRCYB55988957623 
035 |a FRCYB084688957623 
035 |a FRCYB085688957623 
035 |a FRCYB087588957623 
035 |a FRCYB087888957623 
035 |a FRCYB56788957623 
035 |a FRCYB095788957623 
035 |a FRCYB097088957623 
035 |a FRCYB087088957623 
040 |a ABES  |b fre  |e AFNOR 
041 0 |a eng  |2 639-2 
100 1 |a Yildirim, Savas.  |4 aut.  |e Auteur 
245 1 0 |a Mastering Transformers :  |b The Journey from BERT to Large Language Models and Stable Diffusion  |c Savas Yildirim, Meysam Asgari-Chenaghlu. 
264 1 |a Birmingham :  |b Packt Publishing. 
264 2 |a Paris :  |b Cyberlibris,  |c 2024. 
336 |b txt  |2 rdacontent 
337 |b c  |2 rdamedia 
337 |b b  |2 isbdmedia 
338 |b ceb  |2 RDAfrCarrier 
500 |a Cover (https://static2.cyberlibris.com/books_upload/136pix/9781837631506.jpg). 
506 |a Online access is restricted to institutions or libraries that have subscribed  |e Cyberlibris 
520 |a Explore transformer-based language models from BERT to GPT, delving into NLP and computer vision tasks, while tackling challenges effectively. Key Features: Understand the complexity of deep learning architecture and transformers architecture; Create solutions to industrial natural language processing (NLP) and computer vision (CV) problems; Explore challenges in the preparation process, such as problem- and language-specific dataset transformation; Purchase of the print or Kindle book includes a free PDF eBook. Book Description: Transformer-based language models such as BERT, T5, GPT, DALL-E, and ChatGPT have dominated NLP studies and become a new paradigm. Thanks to their accurate and fast fine-tuning capabilities, transformer-based language models have been able to outperform traditional machine learning-based approaches for many challenging natural language understanding (NLU) problems. Aside from NLP, a fast-growing area in multimodal learning and generative AI has recently been established, showing promising results. Mastering Transformers will help you understand and implement multimodal solutions, including text-to-image. Computer vision solutions that are based on transformers are also explained in the book. You'll get started by understanding various transformer models before learning how to train different autoregressive language models such as GPT and XLNet. The book will also get you up to speed with boosting model performance, as well as tracking model training using the TensorBoard toolkit. In the later chapters, you'll focus on using vision transformers to solve computer vision problems. Finally, you'll discover how to harness the power of transformers to model time series data and make predictions. 
By the end of this transformers book, you'll have an understanding of transformer models and how to use them to solve challenges in NLP and CV. What you will learn: Focus on solving simple-to-complex NLP problems with Python; Discover how to solve classification/regression problems with traditional NLP approaches; Train a language model and explore how to fine-tune models to downstream tasks; Understand how to use transformers for generative AI and computer vision tasks; Build transformer-based NLP apps with the Python transformers library; Focus on language generation such as machine translation and conversational AI in any language; Speed up transformer model inference to reduce latency. Who this book is for: This book is for deep learning researchers, hands-on practitioners, and ML/NLP researchers. Educators, as well as students who have a good command of programming subjects, knowledge in the field of machine learning and artificial intelligence, and who want to develop apps in the field of NLP as well as multimodal tasks, will also benefit from this book's hands-on approach. Knowledge of Python (or any programming language) and machine learning literature, as well as a basic understanding of computer science, are required. 
700 1 |a Chenaghlu, Meysam Asgari-.  |4 aut.  |e Auteur 
856 |q HTML  |u https://srvext.uco.fr/login?url=https://univ.scholarvox.com/book/88957623  |w Données éditeur  |z Accès à l'E-book 
886 2 |2 unimarc  |a 181  |a i#  |b xxxe## 
993 |a E-Book  
994 |a BNUM 
995 |a 280315171