Building Transformer Models with PyTorch 2.0 : NLP, computer vision, and speech processing with PyTorch and Hugging Face

Saved in:
Bibliographic details
Main author: Timsina, Prem. (Author)
Format: E-Book
Language: English
Published: New Delhi : BPB Publications.
Other locations: See record in the Sudoc
Summary: Your key to transformer-based NLP, vision, speech, and multimodalities.
Key Features:
- Transformer architecture for different modalities and multimodalities.
- Practical guidelines to build and fine-tune transformer models.
- Comprehensive code samples with detailed documentation.
Description:
This book covers the transformer architecture for various applications, including NLP, computer vision, speech processing, and predictive modeling with tabular data. It is a valuable resource for anyone looking to harness the power of the transformer architecture in their machine learning projects. The book provides a step-by-step guide to building transformer models from scratch and fine-tuning pre-trained open-source models. It explores foundational model architectures, including GPT, ViT, Whisper, TabTransformer, and Stable Diffusion, and the core principles for solving various problems with transformers. The book also covers transfer learning, model training, and fine-tuning, and discusses how to utilize recent models from Hugging Face. Additionally, it explores advanced topics such as model benchmarking, multimodal learning, reinforcement learning, and deploying and serving transformer models. In conclusion, this book offers a comprehensive and thorough guide to transformer models and their various applications.
What you will learn:
- Understand the core architecture of various foundational models, including single-modality and multimodal models.
- Develop transformer-based machine learning models step by step.
- Utilize various open-source models to solve your business problems.
- Train and fine-tune various open-source models using PyTorch 2.0 and the Hugging Face ecosystem.
- Deploy and serve transformer models.
- Apply best practices and guidelines for building transformer-based models.
Who this book is for:
This book caters to data scientists, machine learning engineers, developers, and software architects interested in the world of generative AI.
Online access: Access the E-book
LEADER 04372cmm a2200469 i 4500
001 ebook-280310536
005 20240924015206.0
007 cu|uuu---uuuuu
008 240917s2024||||xx#||||g|||| ||||||eng d
035 |a (OCoLC)1436535542 
035 |a FRCYB88955333 
035 |a FRCYB26088955333 
035 |a FRCYB19188955333 
035 |a FRCYB24788955333 
035 |a FRCYB24888955333 
035 |a FRCYB29388955333 
035 |a FRCYB084688955333 
035 |a FRCYB087588955333 
035 |a FRCYB56788955333 
035 |a FRCYB63288955333 
035 |a FRCYB097088955333 
035 |a FRCYB101388955333 
035 |a FRCYB087088955333 
040 |a ABES  |b fre  |e AFNOR 
041 0 |a eng  |2 639-2 
100 1 |a Timsina, Prem.  |4 aut.  |e Auteur 
245 1 0 |a Building Transformer Models with PyTorch 2.0 :  |b NLP, computer vision, and speech processing with PyTorch and Hugging Face   |c Prem Timsina. 
264 1 |a New Delhi :  |b BPB Publications. 
264 2 |a Paris :  |b Cyberlibris,  |c 2024. 
336 |b txt  |2 rdacontent 
337 |b c  |2 rdamedia 
337 |b b  |2 isbdmedia 
338 |b ceb  |2 RDAfrCarrier 
500 |a Cover (https://static2.cyberlibris.com/books_upload/136pix/9789355517494.jpg). 
506 |a Online access is restricted to institutions and libraries that have subscribed to the service  |e Cyberlibris 
520 |a Your key to transformer-based NLP, vision, speech, and multimodalities. Key Features: Transformer architecture for different modalities and multimodalities; practical guidelines to build and fine-tune transformer models; comprehensive code samples with detailed documentation. Description: This book covers the transformer architecture for various applications, including NLP, computer vision, speech processing, and predictive modeling with tabular data. It is a valuable resource for anyone looking to harness the power of the transformer architecture in their machine learning projects. The book provides a step-by-step guide to building transformer models from scratch and fine-tuning pre-trained open-source models. It explores foundational model architectures, including GPT, ViT, Whisper, TabTransformer, and Stable Diffusion, and the core principles for solving various problems with transformers. The book also covers transfer learning, model training, and fine-tuning, and discusses how to utilize recent models from Hugging Face. Additionally, it explores advanced topics such as model benchmarking, multimodal learning, reinforcement learning, and deploying and serving transformer models. In conclusion, this book offers a comprehensive and thorough guide to transformer models and their various applications. What you will learn: Understand the core architecture of various foundational models, including single-modality and multimodal models; develop transformer-based machine learning models step by step; utilize various open-source models to solve your business problems; train and fine-tune various open-source models using PyTorch 2.0 and the Hugging Face ecosystem; deploy and serve transformer models; apply best practices and guidelines for building transformer-based models. Who this book is for: This book caters to data scientists, machine learning engineers, developers, and software architects interested in the world of generative AI. 
559 2 |b 1. Transformer Architecture  |b 2. Hugging Face Ecosystem  |b 3. Transformer Model in PyTorch  |b 4. Transfer Learning with PyTorch and Hugging Face  |b 5. Large Language Models: BERT, GPT-3, and BART  |b 6. NLP Tasks with Transformers  |b 7. CV Model Anatomy: ViT, DETR, and DeiT  |b 8. Computer Vision Tasks with Transformers  |b 9. Speech Processing Model Anatomy: Whisper, SpeechT5, and Wav2Vec  |b 10. Speech Tasks with Transformers  |b 11. Transformer Architecture for Tabular Data Processing  |b 12. Transformers for Tabular Data Regression and Classification  |b 13. Multimodal Transformers, Architectures and Applications  |b 14. Explore Reinforcement Learning for Transformer  |b 15. Model Export, Serving, and Deployment  |b 16. Transformer Model Interpretability, and Experimental Visualization  |b 17. PyTorch Models: Best Practices and Debugging 
856 |q HTML  |u https://srvext.uco.fr/login?url=https://univ.scholarvox.com/book/88955333  |w Données éditeur  |z Accès à l'E-book 
886 2 |2 unimarc  |a 181  |a i#  |b xxxe## 
993 |a E-Book  
994 |a BNUM 
995 |a 280310536