Title: Recognizing Contextual Polarity in Phrase-Level Sentiment Analysis
1 Recognizing Contextual Polarity in Phrase-Level Sentiment Analysis
- Theresa Wilson
- Janyce Wiebe
- Paul Hoffmann
- University of Pittsburgh
2 Introduction
- Sentiment analysis: the task of identifying positive and negative opinions, emotions, and evaluations
- How detailed? Depends on the application.
- Flame detection, review classification → document-level analysis
- Question answering, review mining → sentence- or phrase-level analysis
3 Question Answering Example
Q: What is the international reaction to the reelection of Robert Mugabe as President of Zimbabwe?

African observers generally approved of his victory while Western governments denounced it.
4 Prior Polarity versus Contextual Polarity
- Most approaches use a lexicon of positive and negative words
- Prior polarity: out of context, positive or negative
- beautiful → positive
- horrid → negative
- A word may appear in a phrase that expresses a different polarity in context
- Contextual polarity

Cheers to Timothy Whitfield for the wonderfully horrid visuals.
5 Example
- Philip Clapp, President of the National Environment Trust, sums up well the general thrust of the reaction of environmental movements: "there is no reason at all to believe that the polluters are suddenly going to become reasonable."
7 Example
- Philip Clapp, President of the National Environment Trust, sums up well the general thrust of the reaction of environmental movements: "there is no reason at all to believe that the polluters are suddenly going to become reasonable."
- Contextual polarity vs. prior polarity (highlighted on the original slide)
8 Goal of Our Research
- Automatically distinguish prior and contextual
polarity
9 Approach
- Use machine learning and a variety of features
- Achieve significant results for a large subset of sentiment expressions
10 Outline
- Introduction
- Manual Annotations
- Corpus
- Prior-Polarity Subjectivity Lexicon
- Experiments
- Previous Work
- Conclusions
11 Manual Annotations
- Need sentiment expressions with contextual polarity
- → positive and negative expressions of emotions, evaluations, stances
- Had subjective expression annotations in the MPQA Opinion Corpus (http://nrrc.mitre.org/NRRC/publications.htm)
- → words/phrases expressing emotions, evaluations, stances, speculations, etc.
- sentiment expressions ⊆ subjective expressions
- Decision: annotate subjective expressions in the MPQA Corpus with their contextual polarity
12 Annotation Scheme
- Mark polarity of subjective expressions as positive, negative, both, or neutral

African observers generally approved (positive) of his victory while Western governments denounced (negative) it.

Besides, politicians refer to good and evil (both)

Jerome says the hospital feels no different than a hospital in the states. (neutral)
13 Annotation Scheme
- Judge the contextual polarity of the sentiment ultimately being conveyed

They have not succeeded, and will never succeed, in breaking the will of this valiant people. (positive)
14 Agreement Study
- 10 documents with 447 subjective expressions
- Kappa = 0.72 (82% agreement)
- Removing uncertain cases (those at least one annotator marked uncertain, 18% of cases): Kappa = 0.84 (90% agreement)
- (But all data included in experiments)
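For reference, the kappa values above follow the standard Cohen's kappa formula: observed agreement corrected for the agreement expected by chance. A minimal sketch (the labels below are illustrative, not the study's data):

```python
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Cohen's kappa for two annotators' parallel label sequences."""
    assert len(labels_a) == len(labels_b)
    n = len(labels_a)
    # Observed agreement: fraction of items where the annotators match.
    observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    # Chance agreement, from each annotator's own label distribution.
    freq_a, freq_b = Counter(labels_a), Counter(labels_b)
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)
```

Perfect agreement gives kappa = 1.0; agreement no better than chance gives 0.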
15 Outline (next: Corpus)
16 Corpus
- 425 documents from MPQA Opinion Corpus
- 15,991 subjective expressions in 8,984 sentences
- Divided into two sets
- Development set
- 66 docs / 2,808 subjective expressions
- Experiment set
- 359 docs / 13,183 subjective expressions
- Divided into 10 folds for cross-validation
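The slides do not say how documents were assigned to the 10 folds; one simple document-level scheme (a round-robin sketch, an assumption, which keeps all expressions from a document in the same fold) might look like:

```python
def document_folds(doc_ids, k=10):
    """Assign documents to k folds round-robin, so that subjective
    expressions from one document never straddle a train/test split."""
    folds = [[] for _ in range(k)]
    for i, doc in enumerate(doc_ids):
        folds[i % k].append(doc)
    return folds
```

Splitting by document, rather than by expression, avoids leaking a document's vocabulary and topic across the train/test boundary.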
17 Outline (next: Prior-Polarity Subjectivity Lexicon)
18 Prior-Polarity Subjectivity Lexicon
- Over 8,000 words from a variety of sources
- Both manually and automatically identified
- Positive/negative words from the General Inquirer and Hatzivassiloglou and McKeown (1997)
- All words in the lexicon tagged with:
- Prior polarity: positive, negative, both, neutral
- Reliability: strongly subjective (strongsubj), weakly subjective (weaksubj)
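An entry of this shape can be modeled as a small lookup table; the entries and field names below are illustrative, not the actual lexicon format:

```python
# Hypothetical lexicon entries: each word carries a prior polarity
# and a reliability class, mirroring the tags described above.
LEXICON = {
    "beautiful": {"prior_polarity": "positive", "reliability": "strongsubj"},
    "horrid":    {"prior_polarity": "negative", "reliability": "strongsubj"},
    "little":    {"prior_polarity": "neutral",  "reliability": "weaksubj"},
}

def prior_polarity(word):
    """Prior polarity of a word; words outside the lexicon are neutral."""
    entry = LEXICON.get(word.lower())
    return entry["prior_polarity"] if entry else "neutral"
```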
19 Outline (next: Experiments)
20 Experiments
- Both steps:
- BoosTexter (AdaBoost.MH), 5,000 rounds of boosting
- 10-fold cross-validation
- Give each instance its own label
21 Definition of Gold Standard
- Given an instance inst from the lexicon:
- if inst is not in a subjective expression:
- goldclass(inst) = neutral
- else if inst is in at least one positive and one negative subjective expression:
- goldclass(inst) = both
- else if inst is in a mixture of negative and neutral:
- goldclass(inst) = negative
- else if inst is in a mixture of positive and neutral:
- goldclass(inst) = positive
- else: goldclass(inst) = contextual polarity of the subjective expression
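The decision rules above translate directly into code; a sketch, where `expression_polarities` (a name chosen here for illustration) lists the contextual polarities of the subjective expressions the instance appears in:

```python
def gold_class(expression_polarities):
    """Gold class for a lexicon instance, following the rules above.
    An empty list means the instance is in no subjective expression."""
    polarities = set(expression_polarities)
    if not polarities:
        return "neutral"
    if "positive" in polarities and "negative" in polarities:
        return "both"
    if polarities == {"negative", "neutral"}:
        return "negative"
    if polarities == {"positive", "neutral"}:
        return "positive"
    # Remaining case: all expressions share one contextual polarity.
    return polarities.pop()
```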
22 Features
- Many inspired by Polanyi & Zaenen (2004), Contextual Valence Shifters
- Examples: little threat, little truth
- Others capture dependency relationships between words
- Example: wonderfully horrid (the adverb wonderfully modifying the adjective horrid in the dependency parse)
23 Step 1 Features
- Word features
- Modification features
- Structure features
- Sentence features
- Document feature
24 Word Features
- Word token: terrifies
- Word part-of-speech: VB
- Context: that terrifies me
- Prior polarity: negative
- Reliability: strongsubj
25 Modification Features
- Binary features:
- Preceded by adjective
- Preceded by adverb (other than not)
- Preceded by intensifier
- Self intensifier
- Modifies a strongsubj clue
- Modifies a weaksubj clue
- Modified by a strongsubj clue
- Modified by a weaksubj clue
- (Figure: dependency parse tree)
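These binary features can be sketched as checks against part-of-speech tags and a dependency parse. The toy `(head, dependent)` pair representation below is an assumption; the paper computed them from the output of a real dependency parser:

```python
def modification_features(word_index, tokens, pos_tags, deps, reliability):
    """Binary modification features for one word.
    deps: toy dependency parse as (head_index, dependent_index) pairs.
    reliability: maps words to 'strongsubj' / 'weaksubj'."""
    prev = word_index - 1
    # The word modifies its head; its dependents modify the word.
    heads = [h for h, d in deps if d == word_index]
    dependents = [d for h, d in deps if h == word_index]
    return {
        "preceded_by_adjective": prev >= 0 and pos_tags[prev].startswith("JJ"),
        "preceded_by_adverb": (prev >= 0 and pos_tags[prev].startswith("RB")
                               and tokens[prev].lower() != "not"),
        "modifies_strongsubj": any(
            reliability.get(tokens[h].lower()) == "strongsubj" for h in heads),
        "modified_by_strongsubj": any(
            reliability.get(tokens[d].lower()) == "strongsubj" for d in dependents),
    }
```

For "wonderfully horrid", with "wonderfully" a dependent of "horrid", the adjective is both preceded by an adverb and modified by a strongsubj clue.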
26 Structure Features
- Binary features:
- In subject: The human rights report poses
- In copular: I am confident
- In passive voice: must be regarded
27 Sentence Features
- Count of strongsubj clues in the previous, current, and next sentence
- Count of weaksubj clues in the previous, current, and next sentence
- Counts of various parts of speech
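The clue counts can be computed with a straightforward pass over tokenized sentences; a sketch, assuming a `lexicon` map from lowercase words to reliability tags:

```python
def sentence_clue_counts(sentences, i, reliability, lexicon):
    """Counts of clues with the given reliability ('strongsubj' or
    'weaksubj') in the previous, current, and next sentences.
    sentences: list of token lists; i: index of the current sentence."""
    def count(sent):
        return sum(1 for tok in sent if lexicon.get(tok.lower()) == reliability)
    prev_s = sentences[i - 1] if i > 0 else []
    next_s = sentences[i + 1] if i + 1 < len(sentences) else []
    return count(prev_s), count(sentences[i]), count(next_s)
```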
28 Document Feature
- Document topic (15 topics), e.g.:
- economics
- health
- ...
- Kyoto protocol
- presidential election in Zimbabwe

Example: The disease can be contracted if a person is bitten by a certain tick or if a person comes into contact with the blood of a Congo fever sufferer.
29 Results 1a
30 Results 1b
31 Step 2: Polarity Classification
- (Figure: two-step classification diagram; instance counts 19,506 and 5,671)
- Classes: positive, negative, both, neutral
32 Step 2 Features
- Word token
- Word prior polarity
- Negated
- Negated subject
- Modifies polarity
- Modified by polarity
- Conjunction polarity
- General polarity shifter
- Negative polarity shifter
- Positive polarity shifter
33 Word Features
- Word token: terrifies
- Word prior polarity: negative
34 Negation Features
- Binary features:
- Negated, e.g.:
- not good
- does not look very good
- not only good but amazing
- ...
- Negated subject:
- No politically prudent Israeli could support either of them.
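A rough sketch of the negation check: a negation word within a small window before the target, with "not only" skipped, since that construction does not flip polarity. The fixed window size is an assumption here; the paper's feature also drew on the parse.

```python
NEGATION_WORDS = {"not", "no", "never", "n't"}

def is_negated(tokens, i, window=4):
    """True if a negation word precedes position i within `window` tokens,
    ignoring the 'not only' construction."""
    for j in range(max(0, i - window), i):
        tok = tokens[j].lower()
        if tok in NEGATION_WORDS:
            # "not only good but amazing" does not negate "good".
            if tok == "not" and j + 1 < len(tokens) and tokens[j + 1].lower() == "only":
                continue
            return True
    return False
```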
35 Modification Polarity Features
- Modifies polarity: 5 values (positive, negative, neutral, both, not mod)
- e.g., substantial → negative
- Modified by polarity: 5 values (positive, negative, neutral, both, not mod)
- e.g., challenge → positive
36 Conjunction Polarity Feature
- Conjunction polarity: 5 values (positive, negative, neutral, both, not mod)
- e.g., good → negative
37 Polarity Shifter Features
- General polarity shifter: pose little threat, contains little truth
- Negative polarity shifter: lack of understanding
- Positive polarity shifter: abate the damage
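These shifter types can be captured as binary features recording whether a shifter of each kind precedes the word, leaving the classifier to learn the effect. The shifter lists below are tiny illustrative samples, and the window-based scope is an assumption:

```python
# Illustrative samples only; the actual shifter lists are larger.
GENERAL_SHIFTERS = {"little"}    # pose little threat, contains little truth
NEGATIVE_SHIFTERS = {"lack"}     # lack of understanding
POSITIVE_SHIFTERS = {"abate"}    # abate the damage

def shifter_features(tokens, i, window=4):
    """Binary features: which shifter types precede position i
    within `window` tokens."""
    feats = {"general_shifter": False, "negative_shifter": False,
             "positive_shifter": False}
    for j in range(max(0, i - window), i):
        tok = tokens[j].lower()
        if tok in GENERAL_SHIFTERS:
            feats["general_shifter"] = True
        if tok in NEGATIVE_SHIFTERS:
            feats["negative_shifter"] = True
        if tok in POSITIVE_SHIFTERS:
            feats["positive_shifter"] = True
    return feats
```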
38 Results 2a
39 Results 2b
40 Ablation Experiments
- Removing features:
- Negated, negated subject
- Modifies polarity, modified by polarity
- Conjunction polarity
- General, negative, and positive polarity shifters
41 Outline (next: Previous Work)
42 Previous Work
- Learning prior polarity of words and phrases
- e.g., Hatzivassiloglou & McKeown (1997), Turney (2002)
- Sentence-level sentiment analysis
- e.g., Yu & Hatzivassiloglou (2003), Kim & Hovy (2004)
- Phrase-level contextual polarity classification
- e.g., Yi et al. (2003)
43 At HLT/EMNLP 2005
- Popescu & Etzioni: Extracting Product Features and Opinions from Reviews
- Choi, Cardie, Riloff & Patwardhan: Identifying Sources of Opinions with Conditional Random Fields and Extraction Patterns
- Alm, Roth & Sproat: Emotions from Text: Machine Learning for Text-based Emotion Prediction
44 Outline (next: Conclusions)
45 Conclusions
- Presented a two-step approach to phrase-level sentiment analysis:
- Determine whether an expression is neutral or polar
- Determine the contextual polarity of the expressions that are polar
- Automatically identifies the contextual polarity of a large subset of sentiment expressions
46 Thank you
47 Acknowledgments
- This work was supported by:
- Advanced Research and Development Activity (ARDA)
- National Science Foundation