Course Description

This course will give a research-oriented overview of the key concepts in Natural Language Processing (NLP) and the techniques used for statistical modeling of natural language data. We will introduce and discuss several NLP tasks, such as sentiment analysis, information extraction, syntactic parsing, and semantic analysis. From a machine learning perspective, these tasks can be viewed as instances of the same underlying problem, structured output learning: mapping raw unstructured text to structured information. We will discuss several structured learning models, inference algorithms, and training paradigms.

Time and Place

  • Tuesday, Thursday, 9-10:15am
    Lawson Computer Science Bldg 1106

Staff

  • Instructor: Dan Goldwasser
    Office Hours: after class
Email: dgoldwas@purdue.edu
  • TA: Xiao (Cosmo) Zhang
    Office Hours: Monday 4-5 pm
Email: zhang923@purdue.edu

Class Policies

  • Grading

    Homework 30%
    Final 35%
    Paper Review+Presentation 15% (see note)
    Project 20%
    Participation Bonus* 10%

  • Late Policies

    You have 24 late hours overall. Use them wisely! After the grace period, you will lose 10 points for every 12 hours.

  • Code of Conduct

Piazza:
Short version: be nice, keep it professional.
Long version

Cheating:
Short version: don't!
Long version
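
As a worked example, the late policy above can be sketched in Python. This is an illustration only: the function name is made up, and it assumes partial 12-hour blocks are charged in full, which the policy does not specify.

```python
import math

GRACE_HOURS = 24        # total late hours you may use across all homeworks
BLOCK_HOURS = 12        # penalty is assessed per 12-hour block past the grace period
PENALTY_PER_BLOCK = 10  # points lost per block

def late_penalty(hours_late: float) -> int:
    """Points deducted for a submission `hours_late` past the deadline.

    Assumption (not stated in the policy): a partial 12-hour block
    counts as a full block.
    """
    hours_over = max(0.0, hours_late - GRACE_HOURS)
    return PENALTY_PER_BLOCK * math.ceil(hours_over / BLOCK_HOURS)

# e.g. 30 hours late: 6 hours past the grace period -> one block -> 10 points
```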

Materials

This is a tentative (and ambitious) list of topics.
Topics and Slides

1. Introduction to NLP, Language Modeling
   Relevant Reading: SLP, chapter 4; Linguistic background
   Lecture Slides: 1, 2

2. Text Classification using NB, Perceptron, LR, SVM
   Relevant Reading: CIML, Chapter 7
   Lecture Slides: 1, 2, 3

3. Introduction to NN: Feedforward Networks, Backprop, DL Frameworks tutorial
   Lecture Slides: 1, 2

4. Word Representations and Lexical Semantics
   Lecture Slides: 1

5. Introduction to Structured Prediction: sequence models
   Lecture Slides: 1

6. Advanced Structured Prediction: beyond sequence models
   Lecture Slides: 1

7. Constituency and Dependency Parsing
   Lecture Slides: 1

8. Discourse-level tasks: Coreference Resolution, Discourse Parsing
   Lecture Slides: 1, 2

9. Deep Structured Prediction: RNN, LSTM, GRU, Recursive Networks, combined with Inference
   Lecture Slides: 1

10. Relation Extraction and Semantics
    Lecture Slides: 1

11. Textual Inference
    Lecture Slides: 1, 2

12. Reinforcement Learning for NLP
    Lecture Slides: 1, 2

How to Present and Review Papers

Here are some papers.
Part of your grade will be determined by a paper presentation and a paper review. These assignments are designed to help you understand how to go beyond the "standard" methods and frame new problems and solutions.

Paper Presentation: You will present one paper (from the list above). The presentation should take around 10-15 minutes and discuss the following points:
(1) Motivation: Why is this an interesting/important problem? What is the first thing you would try, and why isn't it good enough? Make sure to provide context by pointing to relevant work/models.
(2) Key contributions: What are the novel aspects of the paper? These could be a new problem definition, a new technical approach/algorithm, etc. It's important to get this right.
(3) Evaluation: How was the idea evaluated? Are you convinced by the evaluation?
(4) Your own thoughts: Did you like the paper? Are there questions you think should have been addressed? Be critical.

Paper Review: You can review a paper only if you are not presenting it AND before it has been presented by someone else. Note: I will expect you to actively participate when the paper you reviewed is presented!
Paper reviews should be no more than a page and should not plagiarize the original paper (i.e., use your own words!).
Your review should:
(1) Start with a one-paragraph abstract summarizing the paper.
(2) Identify the paper's key scientific question and list its main contributions.
(3) Summarize the technical contribution and identify relevant work (or at least the relevant models discussed in class).
(4) Explain the evaluation setup and its outcomes.
(5) Summarize the strengths and weaknesses of the paper.

Reading Materials and Additional References

Speech and Language Processing, Dan Jurafsky and James H. Martin (recommended but not required)
Neural Network Methods for Natural Language Processing, Yoav Goldberg

Other Relevant Stuff
100 Things You Always Wanted to Know about Linguistics But Were Afraid to Ask
Structured Learning and Prediction in Computer Vision
Linguistic Structured Prediction