Natural Language Processing Syllabus PDF
Summary
This document is a syllabus for an undergraduate Natural Language Processing course. It outlines the course content in five units: basics of deep learning, text preprocessing, text classification, POS tagging, and deep learning in NLP.
Full Transcript
**Natural Language Processing**

**Course Content:**

**Unit I: Basics of Deep Learning (5 lecture hours)**
Basic concepts of Natural Language Processing; understanding text data and speech data; applications of Natural Language Processing.

**Unit II: Text Preprocessing Techniques (6 lecture hours)**
Tokenization, stemming, lemmatization, regular expressions, bag of words, TF-IDF, N-grams (unigrams, bigrams).

**Unit III: Text Classification (10 lecture hours)**
Basic use of machine learning in Natural Language Processing; word embeddings: CBOW, Skip-gram, Word2Vec. Probabilistic learning: introduction to probability, conditional probability, Bayesian learning, Bayes optimal classifier, Naïve Bayes classifier. Sentiment analysis.

**Unit IV: POS Tagging (9 lecture hours)**
Syntax analysis; part-of-speech tagging; rule-based POS tagging; stochastic POS tagging; sequence labeling: Hidden Markov Models.

**Unit V: Deep Learning in NLP (15 lecture hours)**
Applications of RNNs to text data; Recurrent Neural Networks; variants of RNN: Long Short-Term Memory (LSTM), Gated Recurrent Unit (GRU); bidirectional RNNs; sequence-to-sequence learning: encoder and decoder; attention model architecture; Transformers.
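As a taste of the Unit II topics, here is a minimal sketch of tokenization, bag of words, and TF-IDF using only the Python standard library. The corpus, function names, and the simple regex tokenizer are illustrative choices, not part of the syllabus; real coursework would typically use a library such as scikit-learn or NLTK.

```python
import math
import re
from collections import Counter

def tokenize(text):
    """Lowercase the text and extract alphabetic tokens (a simple regex tokenizer)."""
    return re.findall(r"[a-z]+", text.lower())

def tf_idf(corpus):
    """Return one {term: tf-idf weight} dict per document.

    tf  = raw count of the term in the document (the bag-of-words count)
    idf = log(N / df), where df is the number of documents containing the term
    """
    bags = [Counter(tokenize(doc)) for doc in corpus]  # bag-of-words per document
    n = len(bags)
    df = Counter()
    for bag in bags:
        df.update(bag.keys())  # each document counts a term at most once
    return [
        {term: tf * math.log(n / df[term]) for term, tf in bag.items()}
        for bag in bags
    ]

corpus = [
    "the cat sat on the mat",
    "the dog sat on the log",
    "cats and dogs",
]
weights = tf_idf(corpus)
```

In this toy corpus, "the" occurs in two of the three documents, so its weight in the first document is lower than that of "cat", which occurs in only one; this down-weighting of common terms is the point of the IDF factor.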