Natural Language Processing with Transformers in Python

Duration: 11h 23m | Video: .MP4, 1280x720 30 fps | Audio: AAC, 44.1 kHz, 2ch | Size: 3.28 GB
Genre: eLearning | Language: English

Learn next-generation NLP with transformers using PyTorch, TensorFlow, and HuggingFace!


What you'll learn

How to use transformer models for NLP

Modern natural language processing technologies

An overview of recent developments in NLP

Python

Machine Learning

Natural Language Processing

TensorFlow

PyTorch

Transformers

Sentiment Analysis

Question Answering

Named Entity Recognition

Requirements

Knowledge of Python

Experience with data science a plus

Experience with NLP a plus

Description

Transformer models are the de-facto standard in modern NLP. They have proven themselves as the most expressive, powerful models for language by a large margin, beating all major language-based benchmarks time and again.

In this course, we cover everything you need to know to get started building cutting-edge NLP applications using transformer models like Google AI's BERT or Facebook AI's DPR.

We cover several key NLP frameworks including:

HuggingFace's Transformers

TensorFlow 2

PyTorch

spaCy

NLTK

Flair

And learn how to apply transformers to some of the most popular NLP use-cases:

Language classification/sentiment analysis

Named entity recognition (NER)

Question answering

Similarity/comparative learning

Throughout each of these use-cases we work through a variety of examples to ensure that we understand what, how, and why transformers are so important. Alongside these sections we also work through two full-size NLP projects, one for sentiment analysis of financial Reddit data, and another covering a fully-fledged open-domain question-answering application.
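To give a flavour of how little code these use-cases need to get started, here is a minimal sketch using HuggingFace's pipeline API. The example texts and default checkpoints below are illustrative assumptions, not the course's own code or data:

# Minimal sketch of HuggingFace's pipeline API (not the course's own code).
# Assumes `transformers` plus a backend (PyTorch or TensorFlow) is installed;
# with no model name given, the library downloads its default checkpoints.
from transformers import pipeline

# Sentiment analysis, e.g. for a Reddit post about a stock (toy example text).
sentiment = pipeline("sentiment-analysis")
print(sentiment("The earnings report looks great, I'm buying more shares."))
# -> [{'label': 'POSITIVE', 'score': 0.99...}]

# Extractive question answering over a short context passage.
qa = pipeline("question-answering")
print(qa(question="Who developed BERT?",
         context="BERT was developed by researchers at Google AI in 2018."))

The projects in the course go well beyond these defaults, but the pipeline interface is a reasonable first stop for each of the use-cases listed above.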

All of this is supported by several other sections that encourage us to learn how to better design, implement, and measure the performance of our models, such as:

History of NLP and where transformers come from

Common preprocessing techniques for NLP

The theory behind transformers

How to fine-tune transformers (see the short sketch after this list)
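For reference, a bare-bones fine-tuning step with the `transformers` library and PyTorch might look like the sketch below; the checkpoint, labels, and example sentences are placeholder assumptions for illustration only:

# A bare-bones fine-tuning sketch (illustrative only; the checkpoint and toy
# data are assumptions, not taken from the course). Requires transformers + PyTorch.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

checkpoint = "bert-base-uncased"  # any sequence-classification-capable checkpoint
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(checkpoint, num_labels=2)

# Tiny toy batch: two sentences with binary sentiment labels.
texts = ["Shares rallied after strong earnings.", "The stock tanked on the news."]
labels = torch.tensor([1, 0])
batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")

# One training step: passing labels makes the model return a cross-entropy loss.
optimizer = torch.optim.AdamW(model.parameters(), lr=5e-5)
model.train()
outputs = model(**batch, labels=labels)
outputs.loss.backward()
optimizer.step()
print(float(outputs.loss))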

We cover all of this and more. I look forward to seeing you in the course!

Who this course is for:

Aspiring data scientists and ML engineers interested in NLP

Practitioners looking to upgrade their skills

Developers looking to implement NLP solutions

Data scientists

Machine Learning Engineers

Python Developers


