Natural Language Processing with Transformers

Natural Language Processing with Transformers: Building Language Applications with Hugging Face

eBook Details:

  • Paperback: 410 pages
  • Publisher: O’Reilly Media (February 22, 2022)
  • Language: English
  • ISBN-10: 1098103246
  • ISBN-13: 978-1098103248

eBook Description:

Since their introduction in 2017, transformers have quickly become the dominant architecture for achieving state-of-the-art results on a variety of natural language processing tasks. If you’re a data scientist or coder, this practical book shows you how to train and scale these large models using Hugging Face Transformers, a Python-based deep learning library.

  • Build, debug, and optimize transformer models for core NLP tasks, such as text classification, named entity recognition, and question answering (a short pipeline sketch follows this list)
  • Learn how transformers can be used for cross-lingual transfer learning
  • Apply transformers in real-world scenarios where labeled data is scarce
  • Make transformer models efficient for deployment using techniques such as distillation, pruning, and quantization
  • Train transformers from scratch and learn how to scale to multiple GPUs and distributed environments
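
To give a flavor of the workflow (an illustrative sketch, not an excerpt from the book), the Hugging Face Transformers pipeline API covers several of the tasks listed above in a few lines of Python; each pipeline downloads a default pretrained checkpoint on first use:

    from transformers import pipeline

    # Text classification (sentiment analysis) with the default checkpoint
    classifier = pipeline("sentiment-analysis")
    print(classifier("Transformers are remarkably versatile."))

    # Named entity recognition, grouping sub-word tokens into whole entities
    ner = pipeline("ner", aggregation_strategy="simple")
    print(ner("Hugging Face is based in New York City."))

    # Extractive question answering over a short context
    qa = pipeline("question-answering")
    print(qa(question="Which library does the book cover?",
             context="The book covers the Hugging Face Transformers library."))

In practice you would usually pass an explicit model name to pipeline() rather than relying on the defaults; the book’s chapters show how to fine-tune such models on your own data.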

Transformers have been used to write realistic news stories, improve Google Search queries, and even create chatbots that tell corny jokes. In this guide, authors Lewis Tunstall, Leandro von Werra, and Thomas Wolf, among the creators of Hugging Face Transformers, use a hands-on approach to teach you how transformers work and how to integrate them in your applications. You’ll quickly learn a variety of tasks they can help you solve.
