Welcome back. In this video, you'll learn how to implement sequence models in TensorFlow, and if you've heard of models like an RNN, or Recurrent Neural Network, this week we'll teach you how to implement that. It's important not just to take words and compute embeddings for them; the relative ordering, the sequence of the words, matters too for the meaning of a sentence, and if you jumble the words around, that changes, or even destroys, the meaning of the sentence.

Definitely, and so if you take just a very simple sentence, like "my dog sat on my hat," and you swap the words "dog" and "hat," you're now changing from a sentence that has some kind of meaning and some semantics to a ridiculous sentence that has no meaning, right? "My hat sat on my dog." Unless you have a really smart hat, it doesn't really mean anything.

So when we were doing classification based on embeddings, it was very nice that we had words with similar meanings labeled in a particular way, so that we could say, okay, this is a positive review and I got a bunch of vectors that are similar, this is a negative review and I got these vectors in a similar way. But ordering those words then gives us that whole extra layer of meaning that'll help us understand the sentence, rather than just doing a simple classification.

So for a neural network to take into account the ordering of the words, people now use specialized neural network architectures, things like an RNN, or GRU, or LSTM, and you'll see what all these terms mean in a little bit, in order for these specialized neural networks to process natural language.

Yeah. So with something like an RNN, it's really interesting that the context is preserved from timestep to timestep, which can be really useful, but that context might get lost in longer sentences, and that's why I really love LSTMs, because LSTMs have that cell state, and the cell state is almost like a conveyor belt carrying context all the way down, for a long way down the timesteps.

I love that you have a favorite flavor of sequence model.

I hold no shame in keeping favorites. So then, take a really long sentence, like for example, "I grew up in Ireland, so I went to school, and at school they made me learn how to speak something." You could say, okay, the context from "speak" means that it's a language, but you have to go all the way back to the beginning of the sentence to see that it's Ireland, and then in Ireland, what I would learn how to speak.

You speak Gaelic?

Really badly. So let's go learn how to implement all of these things. Let's go on to the next video.
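To make the idea of an order-aware model concrete, here is a minimal sketch of a sequence classifier in TensorFlow. It is an illustration under assumed settings, not the course's implementation: the vocabulary size, embedding dimension, sequence length, and layer sizes are placeholder values, and the binary output assumes a positive/negative review task like the one mentioned above. Swapping the LSTM layer for `tf.keras.layers.SimpleRNN` or `tf.keras.layers.GRU` would give the other architectures discussed here.

```python
import tensorflow as tf

VOCAB_SIZE = 10000      # hypothetical vocabulary size, for illustration only
EMBEDDING_DIM = 64      # hypothetical embedding dimension
MAX_LEN = 120           # hypothetical padded sequence length

model = tf.keras.Sequential([
    # Each input is a padded sequence of word indices.
    tf.keras.Input(shape=(MAX_LEN,)),
    # Map each word index to a dense vector, as in the embedding-based
    # classifiers discussed earlier.
    tf.keras.layers.Embedding(VOCAB_SIZE, EMBEDDING_DIM),
    # The LSTM carries a cell state from timestep to timestep, acting as the
    # "conveyor belt" of context described above. Replacing this layer with
    # tf.keras.layers.SimpleRNN(64) or tf.keras.layers.GRU(64) gives the
    # other sequence architectures mentioned in the video.
    tf.keras.layers.LSTM(64),
    tf.keras.layers.Dense(64, activation='relu'),
    # A single sigmoid unit for a binary label such as positive/negative.
    tf.keras.layers.Dense(1, activation='sigmoid'),
])

model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
model.summary()
```

The only change from the earlier embedding-only classifiers is the recurrent layer between the embedding and the dense layers, which is what lets the model use the order of the words rather than treating them as an unordered bag of vectors.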