Sinan Ozdemir - Introduction to Transformer Models for NLP

Duration: 10h 13m | Video: 1280x720 30fps | Audio: AAC, 48 kHz, 2ch | Size: 2.53 GB
Genre: eLearning | Language: English
Using BERT, GPT, and More to Solve Modern Natural Language Processing Tasks


Learn how to apply state-of-the-art transformer-based models including BERT and GPT to solve modern NLP tasks.
Overview
Introduction to Transformer Models for NLP LiveLessons provides a comprehensive overview of transformers and the mechanisms—attention, embedding, and tokenization—that set the stage for state-of-the-art NLP models like BERT and GPT to flourish. The focus of these lessons is on providing a practical, comprehensive, and functional understanding of transformer architectures and how they are used to create modern NLP pipelines. Throughout the series, instructor Sinan Ozdemir brings the theory to life through illustrations, worked mathematical examples, and straightforward Python examples in Jupyter notebooks.
All lessons in the course are grounded in real-life case studies and hands-on code examples. After completing these lessons, you will be well positioned to understand and build cutting-edge NLP pipelines using transformers. Extensive resources and curriculum details are also available in the course's GitHub repository.
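For a sense of what the hands-on code looks like, the following is a minimal sketch (not taken from the course notebooks) of a transformer-based NLP pipeline, assuming the Hugging Face transformers library is installed:

    # Minimal sketch (an assumption, not course material): a ready-made
    # transformer pipeline from the Hugging Face transformers library.
    from transformers import pipeline

    # Load a sentiment-analysis pipeline; by default this downloads a
    # fine-tuned DistilBERT checkpoint from the Hugging Face Hub.
    classifier = pipeline("sentiment-analysis")

    # The pipeline handles tokenization, the forward pass through the
    # transformer, and decoding of the predicted label.
    print(classifier("Transformers make modern NLP pipelines surprisingly simple."))
    # e.g. [{'label': 'POSITIVE', 'score': ...}]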
About the Instructor
Sinan Ozdemir is currently the Founder and CTO of Shiba Technologies. Sinan is a former lecturer of Data Science at Johns Hopkins University and the author of multiple textbooks on data science and machine learning. Additionally, he is the founder of the recently acquired Kylie.ai, an enterprise-grade conversational AI platform with RPA capabilities. He holds a master's degree in Pure Mathematics from Johns Hopkins University and is based in San Francisco, CA.
Skill Level
Intermediate
Advanced
Learn How To
Recognize which type of transformer-based model is best for a given task
Understand how transformers process text and make predictions (a short sketch follows this list)
Fine-tune a transformer-based model
Create pipelines using fine-tuned models
Deploy fine-tuned models and use them in production
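As a rough illustration of the "how transformers process text" item above (a sketch under assumed tooling, not course material), here is how a BERT-style model tokenizes input and produces one contextual embedding per token, assuming the Hugging Face transformers library and PyTorch:

    # Sketch assuming transformers + PyTorch are installed (not course code).
    import torch
    from transformers import AutoModel, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
    model = AutoModel.from_pretrained("bert-base-uncased")

    # WordPiece tokenization splits rare words into sub-word units and adds
    # the special [CLS] and [SEP] tokens that BERT expects.
    inputs = tokenizer("Attention is all you need.", return_tensors="pt")
    print(tokenizer.convert_ids_to_tokens(inputs["input_ids"][0].tolist()))

    # The forward pass yields one contextual vector per token; the [CLS]
    # vector is often used as a sentence-level representation.
    with torch.no_grad():
        outputs = model(**inputs)
    print(outputs.last_hidden_state.shape)  # (1, number_of_tokens, 768)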
Who Should Take This Course
Intermediate/advanced machine learning engineers with experience in ML, neural networks, and NLP
Those interested in state-of-the-art NLP architectures
Those interested in productionizing NLP models
Those comfortable using libraries like TensorFlow or PyTorch
Those comfortable with linear algebra and vector/matrix operations
Course Requirements
Python 3 proficiency, with some experience working in interactive Python environments, including notebooks (Jupyter, Google Colab, or Kaggle Kernels)
Comfort using the Pandas library and either TensorFlow or PyTorch
Understanding of ML/deep learning fundamentals including train/test splits, loss/cost functions, and gradient descent
https://www.oreilly.com/library/view/introduction-to-transformer/9780137923717/



Download (Rapidgator)
DOWNLOAD FROM RAPIDGATOR.NET
Download (Uploadgig)
DOWNLOAD FROM UPLOADGIG.COM
Download (NitroFlare)
DOWNLOAD FROM NITROFLARE.COM
