
Learner Reviews & Feedback for Natural Language Processing with Attention Models by DeepLearning.AI

4.4
stars
968 ratings

About the Course

In Course 4 of the Natural Language Processing Specialization, you will:

a) Translate complete English sentences into German using an encoder-decoder attention model
b) Build a Transformer model to summarize text
c) Use T5 and BERT models to perform question-answering
d) Build a chatbot using a Reformer model

By the end of this Specialization, you will have designed NLP applications that perform question-answering and sentiment analysis, created tools to translate languages and summarize text, and even built a chatbot! Learners should have a working knowledge of machine learning, intermediate Python including experience with a deep learning framework (e.g., TensorFlow, Keras), as well as proficiency in calculus, linear algebra, and statistics. Please make sure that you’ve completed course 3 - Natural Language Processing with Sequence Models - before starting this course.

This Specialization is designed and taught by two experts in NLP, machine learning, and deep learning. Younes Bensouda Mourri is an Instructor of AI at Stanford University who also helped build the Deep Learning Specialization. Łukasz Kaiser is a Staff Research Scientist at Google Brain and the co-author of Tensorflow, the Tensor2Tensor and Trax libraries, and the Transformer paper....

Top reviews

JH

Oct 4, 2020

Could the instructors maybe make a video explaining the ungraded lab? That would be useful. Other students also find the LSH attention layer ungraded labs difficult to understand. Thanks

SB

Nov 20, 2020

The course is a very comprehensive one and covers all state-of-the-art techniques used in NLP. It's quite an advanced-level course, and good Python coding skills are a must.


26 - 50 of 237 Reviews for Natural Language Processing with Attention Models

By JL B

Nov 8, 2020

Maybe it's my fault, but at some point in these courses I got lost in the logic and the whys of the network constructions. I managed the assignments because, for some of them, you only need to know how to copy and paste to pass.

But I recognize the great value of the material; I think I'll need to revisit it and spend more time on the optional readings.

Still, overall a great specialization. Thanks to everyone involved in these courses!

By Z F

Sep 27, 2022

The deep learning framework Trax used in this course only adds unnecessary difficulty to finishing the assignments. I don't understand why they did not use more common frameworks such as PyTorch or TensorFlow. It seems that the instructor was only reading from a script while presenting the slides. For example, there was an obvious error on the transformer decoder slide, and the instructor did not correct it.

By Israel T

Oct 7, 2020

Very educational! I learned a lot about the different NLP models. However, it seems like weeks 3 and 4 were rushed. Also, some of the items (e.g., what each layer does and why we need that layer) were not properly explained. Other than that, this is a good course for getting a general overview of some of the state-of-the-art NLP models.

By Mark L

Oct 2, 2021

(1) Please consider switching from Trax to TensorFlow. (2) It would be helpful to go over Transformer concepts in more detail, particularly some explanation of why Q, K, and V are called what they are. (3) Not a problem of the course, but it would be helpful if the Trax documentation were more complete.
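For readers puzzled by the Q/K/V naming this reviewer mentions: each token emits a query that is matched against the keys of all tokens, and the match weights blend the corresponding values. Here is a minimal NumPy sketch of scaled dot-product attention (an illustration only, not the course's Trax code):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Queries (Q) are compared against keys (K); the resulting
    softmax weights mix the values (V)."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # query-key similarity
    scores -= scores.max(axis=-1, keepdims=True)     # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over keys
    return weights @ V                               # weighted sum of values

# Toy example: 2 queries attending over 3 key/value pairs, dimension 4.
rng = np.random.default_rng(0)
Q = rng.normal(size=(2, 4))
K = rng.normal(size=(3, 4))
V = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # one output vector per query: (2, 4)
```

The names make sense once you see the roles: only K participates in the similarity scores, and only V is mixed into the output.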

By Felix M

Apr 11, 2021

The classes originally taught by Andrew were, for me, much better. Many of the explanations in this course were unclear and superficial, as I see it.

By Damian S

Feb 24, 2022

Course content is fantastic, but the assignments are ridiculous: they test how well you can read directions, not how well you understand the content.

By Haoyu R

Oct 2, 2020

Not detailed enough. The quality of the course is very good at the start but decreases as the topics go deeper.

By Kévin S

Feb 15, 2022

Looks like a commercial ad for Trax. I don't know if I will be able to re-implement this in another framework.

By Darren

Feb 7, 2022

The general content is good, but there are so many inconsistencies and missing pieces of information in the material. Terms are poorly defined and used inconsistently. A lot of the information about "why" certain things are the way they are in the programming assignments is missing; you just "do it" without understanding it. Also, the instructors have abandoned the course forums. There are lots of questions about the content in the discussion forums, but none of the content creators are helping answer them. We're just left to fend for ourselves. Not worth the money. Just watch the videos.

By Lucky S

Feb 24, 2022

This course is the weakest of the Specialization.

Courses 1-3 were very strong and solid, but Course 4 feels very rushed. The curriculum is very hard to follow, let alone understand. The labs weren't commented enough to give us a proper explanation (especially week 4). There are a lot of concepts that aren't explained at the length they should be.

By Valerio G

Mar 24, 2021

I'm very disappointed with the whole deeplearning.ai NLP Specialization in general, but this course was the icing on the cake.

The course treats advanced, state-of-the-art techniques in NLP with neural networks, but the theoretical lectures are confusing and imprecise. The programming assignments are totally useless, since the user is asked to implement the network architectures discussed in the lectures using a "fill in the dots" approach with a very restrictive starter structure. In my personal experience, this yielded a close-to-zero learning outcome, but a lot of frustration in trying to get around some bugs in the auto-grading system by desperately browsing posts from the learner community.

I came here after the very nice Deep Learning Specialization taught by Andrew Ng and wasn't expecting this.

By George L

Apr 11, 2021

Younes is a bad teacher. He may have good technical chops, but teaching is a different skill altogether. Overall, the NLP Specialization design is much, much worse compared with the DL Specialization. On one hand, you are taught a lot of material that is deep but covered cursorily; on the other hand, the exercises are either too difficult to get any clue about or, most of the time, actually too simple, so you only need to enter simple parameters and therefore cannot really learn anything! I really don't know why so many people give a 5-star rating!

By Yuri C

Jan 6, 2021

The last course in the NLP specialization is intense! Already in the first week the learner is put through their tensor algebra baptism, and it goes even deeper while building the locality-sensitive hashing inner workings. I am very grateful to the team for putting so much effort into teaching us how attention works and how to improve it by building the Reformer model. The opportunity to get this material from some of the developers of the model is priceless! Thank you for that! Surely, in everyday NLP one mostly uses the layers provided by Trax directly. But the understanding of the drawbacks and the ideas behind these models is indeed the unique selling proposition of this whole course. The provided infographics are deeply helpful for understanding what goes on with the tensors inside the models, and the instructors do their best to introduce those ideas throughout the course. I was also *very* impressed to see how up-to-date all the material of this latest course is! Some of the papers about the models were put on arXiv only 1-2 years ago. This is by far very hard to beat in any massive open online course! Thank you very much for providing this to the community at such an accessible price tag. I will be eagerly waiting for a continuation of this specialization as Advanced NLP!
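The locality-sensitive hashing this reviewer mentions is the trick the Reformer uses to avoid comparing every query with every key: Reformer-style angular LSH projects each vector onto a few random directions and takes the argmax over those projections and their negations as a bucket id, so vectors with high cosine similarity tend to land in the same bucket and attention is only computed within buckets. A minimal sketch of the bucketing step (illustrative only, not the actual Trax implementation):

```python
import numpy as np

def lsh_buckets(x, n_buckets=8, seed=0):
    """Angular LSH: project onto random directions R and take the
    argmax over [xR, -xR]. Similar vectors tend to share a bucket."""
    rng = np.random.default_rng(seed)
    d = x.shape[-1]
    R = rng.normal(size=(d, n_buckets // 2))   # random projection matrix
    rotated = x @ R
    # Concatenating the negated projections yields n_buckets candidates;
    # the index of the largest one is the bucket id.
    return np.argmax(np.concatenate([rotated, -rotated], axis=-1), axis=-1)

# Two nearly parallel vectors and one pointing the opposite way.
vecs = np.array([[1.0, 0.0],
                 [0.99, 0.05],
                 [-1.0, 0.0]])
b = lsh_buckets(vecs)
print(b)  # the first two (similar) vectors share a bucket; the third does not
```

Within each bucket, attention then runs as usual, which is what turns the quadratic attention cost into something closer to O(n log n) for long sequences.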

By SNEHOTOSH K B

Nov 21, 2020

The course is a very comprehensive one and covers all state-of-the-art techniques used in NLP. It's quite an advanced-level course, and good Python coding skills are a must.

By D B

Jan 25, 2023

I learned a lot from this course, and the ungraded and graded problems are relevant to understanding and knowing how to build a Transformer or a Reformer from scratch.

By satish b

Jan 1, 2021

One of the best courses I have ever taken. The course provides in-depth learning of transformers from the creators of the Transformer.

By Jonathan M

Nov 16, 2020

The course was wonderful, full of updated content and explained in a really good way. Good work!

By Akash M

Sep 26, 2020

Outstanding course. The course was rigorous.

By Sarkis K

Apr 17, 2023

The courses have really enlightened me on NLP; I had no idea about the techniques. I'll give it 4 stars, because the course instructors lecture with a monotony, as if reading from a teleprompter with a fake synthetic voice. It sometimes gives me a headache, and I end up muting the videos and just reading the subtitles (which often don't make sense and are paced too fast, so I have to freeze the screen and open two other windows to read the lower caption text). I have been doing many courses on this platform, and even though the instructors are at the top of their fields, the way they deliver the courses is just "sometimes", "not always", painful. I am sure this is not how they teach their own classes, especially at Stanford. Even though the course is $50 per month, I think it wouldn't cost the instructors much to show some authentic enthusiasm.

By Simon P

Dec 6, 2020

The course could have been expanded to an entire specialization. There's a little too much information and the first two assignments are disproportionately long and hard compared with the last two. It is cutting edge material though, and well worth it.

A slight annoyance: the script reading means the videos lack a natural flow, and you end up with nonsense sentences like "now we multiply double-uwe sub en superscript dee by kay sub eye superscript jay to get vee sub eye". Variables such as X_i should be referred to by what they actually represent, not by their algebraic representation, because this is not how the brain processes them when they are read from a page.

By Dave J

May 3, 2021

The content is interesting and current, citing some 2020 papers. I was disappointed by the amount of lecture material - around 40-45 minutes per week in weeks 1-3 and only 20 minutes in week 4, plus two Heroes of NLP interviews. The lectures have the feel of reading from a script rather than engaging with the learner. They're not bad but there's room for improvement. Explanations are usually adequate but some areas could have been explained more clearly.

Programming assignments worked smoothly in my experience, though not particularly challenging: they're largely "painting by numbers".

By Rukang X

Sep 24, 2021

It would be better to use TensorFlow to implement these cutting-edge algorithms, given its popularity in both academia and industry.

By Amit J

Jan 2, 2021

Though the content is extremely good and cutting edge, the course presentation/instructor hasn't done justice to the course. [1] Teaching concepts through assignments (and not covering them in detail in lectures) is an absolutely bad idea. [2] Lecture instructions are ambiguous and immature at times. That the instructor is an excellent engineer but a bad teacher is very evident from the presentation. [4] If input/output dimensions had been mentioned at every boundary in the network illustrations, it would have made a lot of difference in speed of understanding, without having to hunt through offline material and papers. [5] Using Trax is, I think, not a good idea for this course. The documentation is nearly non-existent, a lot of details of functions are hidden, and the only way to understand them is to look at the code. A more established framework like TensorFlow or PyTorch would have been much more helpful.

Overall a disappointment given the quality of other courses available on Coursera.

By Laurence G

Apr 11, 2021

Pros: Good choice of content coverage. Provides a historic overview of the field, covering the transition from early work on seq2seq with LSTMs, through the early forays into attention, to the more modern models first introduced in Vaswani et al. Week 4 covers the Reformer model, which was quite exciting. Decent labs.

Cons: The videos aren't great; there are a lot of better resources out there, many actually included in the course's reference section. Trax is not a good framework for learners in comparison to PyTorch, but if you plan on using TPUs and appreciate the pure functional style and stack semantics, then it's worthwhile. The labs can be a bit copy-pasty. Some of the diagrams are awful; find other resources if this is a problem.

Overall: I'd probably rate this course a 3.5 but wouldn't round up. The videos really let things down for me, but I persisted because the lesson plan and labs were pretty good.

By Christine D

Jan 22, 2021

Even though the theory is very interesting and well explained, the videos dive too deep into certain concepts without explaining very well the practical things you can do with them.

The practical stuff, especially the graded assignments, is very centered around Trax, and the only things you have to know and understand are basic Python and logic. You don't really get to make your own stuff; you just fill in things like "temperature=temperature" or "counter += 1".

I preferred and recommend the first two courses in this NLP-specialization.