In this project, we will cover in detail the architecture of the transformer as used in natural language processing. We will go through the key NLP techniques of the pre-transformer era, such as bag-of-words and word2vec, and then trace the origin and gradual refinement of transformers. Finally, we will study one of the most popular state-of-the-art transformer models, BERT, and use it for text classification on a large dataset, as sketched below.
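A minimal sketch of BERT-based text classification with the Hugging Face `transformers` library, assuming a binary labeling task; the example texts, number of labels, and checkpoint name are placeholders rather than the project's exact setup.

```python
# Sketch: classify short texts with a pre-trained BERT checkpoint.
import torch
from transformers import BertTokenizer, BertForSequenceClassification

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2  # assumption: two classes
)

# Placeholder inputs standing in for the real dataset.
texts = ["This shirt runs small but looks great.",
         "Package arrived damaged and late."]
inputs = tokenizer(texts, padding=True, truncation=True,
                   max_length=128, return_tensors="pt")

# Forward pass: logits -> predicted class id per text.
with torch.no_grad():
    logits = model(**inputs).logits
preds = logits.argmax(dim=-1)
print(preds)  # e.g. tensor([0, 1])
```

In practice the classification head would be fine-tuned on the project's dataset before inference; the snippet only shows the tokenization and forward-pass plumbing.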
Want to search for images of clothes that have text on them? This project walks through how we can classify whether an image contains text or not. For this we use a state-of-the-art model called Inception and take a deep dive into how it behaves on our dataset; a rough sketch of the approach follows.
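A minimal sketch of transfer learning with InceptionV3 in Keras for a binary "text vs. no text" image classifier; the directory layout, hyperparameters, and class names are assumptions for illustration, not the project's exact configuration.

```python
# Sketch: freeze an ImageNet-pretrained InceptionV3 backbone and train a
# small binary head to detect whether an image contains text.
import tensorflow as tf
from tensorflow.keras.applications import InceptionV3
from tensorflow.keras import layers, models

# Load InceptionV3 without its original classification head.
base = InceptionV3(weights="imagenet", include_top=False,
                   input_shape=(299, 299, 3))
base.trainable = False  # keep the convolutional features fixed

# Small head for the binary decision: text / no text.
model = models.Sequential([
    base,
    layers.GlobalAveragePooling2D(),
    layers.Dense(128, activation="relu"),
    layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam",
              loss="binary_crossentropy",
              metrics=["accuracy"])

# Assumed layout: data/train/{text,no_text}/*.jpg
train_ds = tf.keras.utils.image_dataset_from_directory(
    "data/train", image_size=(299, 299),
    batch_size=32, label_mode="binary")

model.fit(train_ds, epochs=3)
```

Freezing the backbone keeps training fast on a small labeled set; unfreezing the top Inception blocks for a second, lower-learning-rate pass is a common refinement once the head has converged.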