Deep Learning for NLP – Part 5

This course is part of the “Deep Learning for NLP” series. In this course, I will talk about various design schemes for efficient Transformer models. These techniques will come in handy for both academic and industry participants. For industry use cases, Transformer models have been shown to achieve very high accuracy across many NLP tasks, but their quadratic memory and computational complexity make them very difficult to ship. Thus, this course, which focuses on methods to make Transformers efficient, is critical for anyone who wants to ship Transformer models as part of their products.

Time and activation memory in Transformers grow quadratically with the sequence length. This is because in every layer, every attention head computes a transformed representation for every position by “paying attention” to tokens at every other position. Quadratic complexity means that, in practice, the maximum input length is rather limited, so we cannot extract semantic representations for long documents by passing them directly to a Transformer. Hence, in this module we will talk about methods to address this challenge.
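To make the quadratic term concrete, here is a minimal NumPy sketch of vanilla scaled dot-product attention (my own illustration, not code from the course); note the explicit (n, n) score matrix, which is exactly what the efficient variants try to avoid:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def full_attention(Q, K, V):
    """Standard attention for one head. Q, K, V: (n, d); returns (n, d)."""
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)  # (n, n): time and memory quadratic in n
    return softmax(scores) @ V     # every position attends to every other position

n, d = 4096, 64
rng = np.random.default_rng(0)
Q, K, V = (rng.standard_normal((n, d)) for _ in range(3))
out = full_attention(Q, K, V)      # the (n, n) scores alone hold ~16.8M floats
```

Doubling n quadruples the size of the score matrix, which is why long documents quickly exhaust memory.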

The course consists of two main sections. Across these two sections, I will talk about efficient Transformer models, a benchmark for efficient Transformers, and a comparison of various efficient Transformer methods.

In the first section, I will talk about methods like the Star-Transformer, Sparse Transformer, Reformer, Longformer, Linformer, and Synthesizer.
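As a taste of this family before the lectures, here is a hedged sketch of the Linformer idea: the keys and values are compressed along the sequence axis with learned projections, so the score matrix is (n, k) with k ≪ n instead of (n, n). The variable names and sizes below are illustrative, not taken from the course:

```python
import numpy as np

def linformer_attention(Q, K, V, E, F):
    """Linformer-style attention for one head.
    Q, K, V: (n, d); E, F: (k, n) learned projections that compress
    the sequence axis of the keys and values from n down to k << n."""
    d = Q.shape[-1]
    K_proj, V_proj = E @ K, F @ V            # (k, d) each
    scores = Q @ K_proj.T / np.sqrt(d)       # (n, k): linear in n for fixed k
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V_proj                  # (n, d), no (n, n) matrix built

n, k, d = 4096, 256, 64
rng = np.random.default_rng(0)
Q, K, V = (rng.standard_normal((n, d)) for _ in range(3))
E, F = (rng.standard_normal((k, n)) / np.sqrt(n) for _ in range(2))
out = linformer_attention(Q, K, V, E, F)     # (4096, 64)
```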

In the second section, I will talk about methods like ETC (Extended Transformer Construction), Big Bird, the linear attention Transformer, Performer, the Sparse Sinkhorn Transformer, and Routing Transformers. Long Range Arena is a recent benchmark for evaluating models on long-sequence tasks with respect to accuracy, memory usage, and inference time. We will discuss the details of Long Range Arena and finally wrap up with a philosophical categorization of the various efficient Transformer methods.
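As a preview of the kernelized family (the linear attention Transformer and Performer), the sketch below replaces the softmax with a positive feature map φ so attention can be reordered as φ(Q)(φ(K)ᵀV) and computed in linear time. The elu(x)+1 feature map and the names here are illustrative choices, not necessarily the exact formulation covered in the lectures:

```python
import numpy as np

def feature_map(x):
    # elu(x) + 1: a simple positive feature map used in linear attention
    return np.where(x > 0, x + 1.0, np.exp(x))

def linear_attention(Q, K, V):
    """O(n) attention: with a kernel phi, attention becomes
    phi(Q) @ (phi(K).T @ V), so no (n, n) matrix is ever formed."""
    Qf, Kf = feature_map(Q), feature_map(K)   # (n, d) each
    KV = Kf.T @ V                             # (d, d): independent of n
    Z = Qf @ Kf.sum(axis=0, keepdims=True).T  # (n, 1) normalizer
    return (Qf @ KV) / Z                      # (n, d)

n, d = 4096, 64
rng = np.random.default_rng(0)
Q, K, V = (rng.standard_normal((n, d)) for _ in range(3))
out = linear_attention(Q, K, V)               # linear in n: doubling n doubles cost
```

Performer goes further by using random features that approximate the softmax kernel; the simple feature map above trades that fidelity for clarity.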

For each method, we will discuss the specific optimization scheme, the architecture, and the results obtained for pretraining as well as downstream tasks.

Who this course is for:

  • Beginners in deep learning
  • Python developers interested in data science concepts
  • Masters or PhD students who wish to learn deep learning concepts quickly
  • Folks wanting to ship their products across regions and languages (internationalization of their learning/predictive/generative models)

