Abstract of the Talk
With the advent of Deep Learning (DL), Natural Language Processing (NLP) is undergoing a revolutionary change. Every few months, if not weeks, a new DL model sets fresh benchmarks on existing NLP challenges, thereby creating tougher ones. A fair share of the credit goes to these models' ability to represent textual meaning at the phrase, sentence, and even higher levels, beyond individual words. In this talk, we discuss the pros, cons, and intuition behind some popular deep learning models such as Recurrent Neural Networks (RNNs), Long Short-Term Memory networks (LSTMs), and Transformers. These models rely on compositional semantics, i.e., constructing complex meaning representations from the constituent word/token embeddings (vectors). We hope to give a general overview of current DL trends in compositional semantics for natural language understanding.
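As a concrete illustration of what "composing" token embeddings means, here is a minimal sketch, assuming PyTorch; the toy vocabulary, tensor sizes, and variable names are illustrative only and not part of the talk. It contrasts the simplest composition (averaging word vectors, which ignores word order) with a recurrent one (an LSTM's final hidden state, which processes tokens left to right and is therefore order-sensitive):

import torch
import torch.nn as nn

# Hypothetical toy vocabulary and sizes, chosen only for illustration
vocab = {"the": 0, "cat": 1, "sat": 2}
embed = nn.Embedding(num_embeddings=len(vocab), embedding_dim=8)
lstm = nn.LSTM(input_size=8, hidden_size=16, batch_first=True)

tokens = torch.tensor([[vocab["the"], vocab["cat"], vocab["sat"]]])  # shape (1, 3)
word_vectors = embed(tokens)                # (1, 3, 8): one vector per token

# Simplest composition: mean of the word vectors (order-insensitive)
sentence_avg = word_vectors.mean(dim=1)     # (1, 8)

# Recurrent composition: the LSTM folds the tokens together sequentially,
# so its final hidden state depends on word order
_, (h_n, _) = lstm(word_vectors)
sentence_lstm = h_n[-1]                     # (1, 16)

The talk's premise is that richer composition functions of this kind (recurrence, gating, self-attention) are what let DL models capture phrase- and sentence-level meaning.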
About the speaker
Ms. Jeena KK
Research Scholar
Dept. of CSE, NIT Calicut
Webinar Schedule
Date and Time: 17 Aug 2020, 2:30 PM