Getting Started with Google BERT

eBook Details:

  • Paperback: 352 pages
  • Publisher: Packt Publishing (February 9, 2021)
  • Language: English
  • ISBN-10: 1838821597
  • ISBN-13: 978-1838821593

eBook Description:

Getting Started with Google BERT: Kickstart your NLP journey by exploring BERT and its variants such as ALBERT, RoBERTa, DistilBERT, VideoBERT, and more with Hugging Face’s transformers library

BERT (Bidirectional Encoder Representations from Transformers) has revolutionized the world of natural language processing (NLP) with promising results. This book is an introductory guide that will help you get to grips with Google’s BERT architecture. Starting with a detailed explanation of the transformer architecture, it will help you understand how the transformer’s encoder and decoder work.

You’ll explore the BERT architecture by learning how the BERT model is pre-trained and how to fine-tune pre-trained BERT for downstream NLP tasks such as sentiment analysis and text summarization with the Hugging Face transformers library. As you advance, you’ll learn about different variants of BERT such as ALBERT, RoBERTa, and ELECTRA, and look at SpanBERT, which is used for NLP tasks like question answering. You’ll also cover simpler and faster BERT variants based on knowledge distillation, such as DistilBERT and TinyBERT. The book takes you through multilingual BERT (M-BERT), XLM, and XLM-R in detail and then introduces you to Sentence-BERT, which is used for obtaining sentence representations. Finally, you’ll discover domain-specific BERT models such as BioBERT and ClinicalBERT, and explore an interesting variant called VideoBERT.
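To give a flavor of that workflow, here is a minimal sketch of fine-tuning a pre-trained BERT checkpoint for sentiment analysis with the Hugging Face transformers library. The checkpoint, the IMDb dataset, and the hyperparameters are illustrative assumptions, not the book’s own example.

    # A minimal sketch (not the book's exact example) of fine-tuning
    # pre-trained BERT for sentiment analysis with Hugging Face transformers.
    from datasets import load_dataset
    from transformers import (
        BertTokenizerFast,
        BertForSequenceClassification,
        Trainer,
        TrainingArguments,
    )

    model_name = "bert-base-uncased"  # assumed checkpoint
    tokenizer = BertTokenizerFast.from_pretrained(model_name)
    model = BertForSequenceClassification.from_pretrained(model_name, num_labels=2)

    # IMDb is used here purely as an illustrative sentiment dataset.
    dataset = load_dataset("imdb")

    def tokenize(batch):
        return tokenizer(batch["text"], truncation=True,
                         padding="max_length", max_length=128)

    encoded = dataset.map(tokenize, batched=True)

    trainer = Trainer(
        model=model,
        args=TrainingArguments(output_dir="bert-sentiment",
                               num_train_epochs=1,
                               per_device_train_batch_size=16),
        # Small subsets keep this sketch quick to run; use the full splits in practice.
        train_dataset=encoded["train"].shuffle(seed=42).select(range(2000)),
        eval_dataset=encoded["test"].shuffle(seed=42).select(range(500)),
    )
    trainer.train()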

  • Understand the transformer model from the ground up
  • Find out how BERT works and pre-train it using masked language model (MLM) and next sentence prediction (NSP) tasks
  • Get hands-on with BERT by learning to generate contextual word and sentence embeddings (see the sketch after this list)
  • Fine-tune BERT for downstream tasks
  • Get to grips with ALBERT, RoBERTa, ELECTRA, and SpanBERT models
  • Get the hang of the BERT models based on knowledge distillation
  • Understand cross-lingual models such as XLM and XLM-R
  • Explore Sentence-BERT, VideoBERT, and BART
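
As a pointer for the embeddings item above, the following is a small sketch of extracting contextual word embeddings, and a simple [CLS]-based sentence embedding, from pre-trained BERT with the transformers library. The checkpoint and input sentence are assumptions for illustration only.

    # A minimal sketch of obtaining contextual embeddings from pre-trained BERT.
    import torch
    from transformers import BertTokenizer, BertModel

    tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")  # assumed checkpoint
    model = BertModel.from_pretrained("bert-base-uncased")

    inputs = tokenizer("I love Paris", return_tensors="pt")  # illustrative sentence

    with torch.no_grad():
        outputs = model(**inputs)

    # One contextual vector per token (including the [CLS] and [SEP] special tokens).
    token_embeddings = outputs.last_hidden_state   # shape: [1, num_tokens, 768]

    # The [CLS] vector is often used as a rough sentence representation;
    # Sentence-BERT (covered later in the book) yields better sentence embeddings.
    sentence_embedding = token_embeddings[:, 0, :]  # shape: [1, 768]

    print(token_embeddings.shape, sentence_embedding.shape)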

By the end of this book, you’ll be well-versed in using BERT and its variants to perform practical NLP tasks.
