Anuja Tayal
Review: Man is to Computer Programmer as Woman is to Homemaker? Debiasing Word Embeddings
Nowadays, as we blindly train machine learning models without understanding what these models are learning, and just focus on the…
2 min read · Aug 6, 2022
Review: Infusing Fine Tuning with Semantic Dependencies
Previous models were application-specific, pre-trained on a large corpus, but it is still unknown whether they actually capture the…
2 min read · Aug 6, 2022
Review: Neural Machine Translation by Jointly Learning to Align and Translate
This landmark paper laid the foundation for the concept of attention in machine translation, which is still used in…
3 min read · Aug 6, 2022
BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding
Researchers at Google proposed the state-of-the-art model BERT (Bidirectional Encoder Representations from Transformers) to improve fine-tuned…
2 min read · Aug 6, 2022
Including Signed Language in Natural Language Processing
In their ACL’21 position paper, the authors want the NLP community to expand and include sign language, a primary means of communication for…
2 min read · Jul 18, 2022
E2E-VLP: End-to-End Visual-Language Pre-training Enhanced by Visual Learning
Until now, pre-training for cross-modal downstream tasks has included only image-based features and not text-based features. Image-based…
2 min read · Jul 18, 2022
Learn Language Processing- How a Baby Does
There is a great Chinese proverb: "To learn a language is to have one more window from which to look at the world." That’s how great a…
3 min read · Nov 25, 2020