Natural Language Processing
2 
Why “natural language”? 
 Natural vs. artificial 
 Language vs. English
3 
Why “natural language”? 
 Natural vs. artificial 
 Not precise, ambiguous, wide range of 
expression 
 Language vs. English 
 English, French, Japanese, Spanish
4 
Why “natural language”? 
 Natural vs. artificial 
 Not precise, ambiguous, wide range of expression 
 Language vs. English 
 English, French, Japanese, Spanish 
 Natural language processing = programs and theories aimed at understanding a problem or question posed in natural language and answering it
5 
Approaches 
 System building 
 Interactive 
 Understanding only 
 Generation only 
 Theoretical 
 Draws on linguistics, psychology, 
philosophy
6 
 Building an NL system is hard 
 Unlikely to be possible without solid 
theoretical underpinnings
7 
Natural language is useful 
 Question-answering systems 
 http://tangra.si.umich.edu/clair/NSIR/NSIR.cgi 
 Mixed-initiative systems 
 http://www.cs.columbia.edu/~noemie/match.mpg 
 Information extraction 
 http://nlp.cs.nyu.edu/info-extr/biomedical-snapshot.jpg 
 Systems that write/speak 
 http://www-2.cs.cmu.edu/~awb/synthesizers.html 
 MAGIC 
 Machine translation 
 http://world.altavista.com/babelfish
8 
Topics 
 Syntax 
 Semantics 
 Pragmatics 
 Statistical NLP: combining learning 
and NL processing
9 
Goal of Interpretation 
 Identify sentence meaning 
 Do something with meaning 
 Need some representation of 
action/meaning
10 
Analysis of form: Syntax 
 Which parts were damaged by larger 
machines? 
 Which parts damaged larger machines? 
 Which larger machines damaged parts? 
 Approaches: 
 Statistical part of speech tagging 
 Parsing using a grammar 
 Shallow parsing: identify meaningful 
chunks
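A minimal sketch of the first and third approaches, using NLTK (the toolkit is our choice for illustration; the slides name none):

```python
# Sketch: statistical POS tagging plus shallow NP chunking with NLTK.
import nltk

sentence = "Which parts were damaged by larger machines?"
tokens = nltk.word_tokenize(sentence)

# Statistical part-of-speech tagging
tagged = nltk.pos_tag(tokens)
# e.g. [('Which', 'WDT'), ('parts', 'NNS'), ('were', 'VBD'), ...]

# Shallow parsing: chunk determiner + adjectives + noun into NPs
chunker = nltk.RegexpParser("NP: {<DT|WDT>?<JJ.*>*<NN.*>+}")
print(chunker.parse(tagged))
```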
11 
Which parts were damaged by larger 
machines? 
Parse tree: 
[S(Q) [NP [N machines]] 
      [VP [V(past) damage] 
          [NP(Q) [Det(Q) which] [ADJ larger] [N parts]]]]
12 
Which parts were damaged by 
machines? – with functional roles 
Parse tree: 
[S(Q) [NP(SUBJ) [N machines]] 
      [VP [V(past) damage] 
          [NP(Q)(OBJ) [Det(Q) which] [ADJ larger] [N parts]]]]
13 
Which parts damaged machines? – with 
functional roles 
Parse tree: 
[S(Q) [NP(Q)(SUBJ) [Det(Q) which] [ADJ larger] [N parts]] 
      [VP [V(past) damage] 
          [NP(OBJ) [N machines]]]]
14 
Parsers 
 Grammar 
 S -> NP VP 
 NP -> DET {ADJ*} N 
 Different types of grammars 
 Context Free vs. Context Sensitive 
 Lexical Functional Grammar vs. Tree Adjoining 
Grammars 
 Different ways of acquiring grammars 
 Hand-encoded vs. machine learned 
 Domain independent (TreeBank, Wall Street 
Journal) 
 Domain dependent (Medical texts)
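The toy grammar above is small enough to run directly. A sketch with NLTK's chart parser, with the optional-adjective rule {ADJ*} unrolled into alternatives (the parser choice is an assumption):

```python
import nltk

# The slide's toy grammar, written out as an NLTK context-free grammar.
grammar = nltk.CFG.fromstring("""
  S   -> NP VP
  NP  -> Det N | Det ADJ N | N
  VP  -> V NP
  Det -> 'which'
  ADJ -> 'larger'
  N   -> 'parts' | 'machines'
  V   -> 'damaged'
""")

parser = nltk.ChartParser(grammar)
for tree in parser.parse("which larger machines damaged parts".split()):
    tree.pretty_print()
```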
15 
Semantics: analysis of meaning 
 Word meaning 
 John picked up a bad cold. 
 John picked up a large rock. 
 John picked up Radio Netherlands on his radio. 
 John picked up a hitchhiker on Highway 66. 
 Phrasal meaning 
 Baby bonuses -> allocations 
 Senior citizens -> personnes âgées 
 Causing havoc -> sème le désarroi 
 Approaches 
 Representing meaning 
 Statistical word sense disambiguation 
 Symbolic rule-based vs. shallow statistical semantics
16 
Representing Meaning - WordNet
17
18 
OMEGA 
 http://omega.isi.edu:8007/index 
 http://omega.isi.edu/doc/browsers.html
19
20 
Statistical Word Sense Disambiguation 
 Context within the sentence determines which sense is correct 
 The candidate picked up [sense6] thousands of 
additional votes. 
 He picked up [sense2] the book and started to read. 
 Her performance in school picked up [sense13]. 
 The swimmers got out of the river and climbed the 
bank [sloping land] to retrieve their towels. 
 The investors took their money out of the bank 
[financial institution] and moved it into stocks and 
bonds.
21 
Goal 
 A program that can predict which sense is correct, given a new sentence containing “pick up” or “bank” 
 Avoid manually itemizing all words that can occur in sentences with different meanings 
 Can we use machine learning?
22 
What do we need? 
 Data 
 Features 
 Machine Learning algorithm 
 Decision tree vs. SVM/Naïve Bayes 
 Inspecting the output 
 Accuracy of these methods
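A minimal sketch of this setup for “bank”, using bag-of-words context features and Naive Bayes via scikit-learn; the four training sentences are invented for illustration, and a real system would train on many labeled occurrences:

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

train_sentences = [
    "the swimmers climbed the bank of the river to retrieve their towels",
    "the flood washed away the muddy bank",
    "the bank raised its interest rates again",
    "she moved her money from the bank into stocks and bonds",
]
train_senses = ["sloping land", "sloping land",
                "financial institution", "financial institution"]

# Bag-of-words context features + Naive Bayes classifier
model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(train_sentences, train_senses)

print(model.predict(["the investors took their money out of the bank"]))
# -> ['financial institution'] (with this toy data)
```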
23 
Using Categories from Roget’s 
Thesaurus (e.g., machine vs. animal) 
for training
24 
Training data for “machines”
25
26 
Predicting the correct sense in unseen text 
 Use the presence of salient words in context 
 50-word window 
 Use Bayes’ rule to compute probabilities for the different categories
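A sketch of that decision rule: each salient word in the window votes with log P(word | sense), and the highest-scoring sense wins. The probability values below are invented stand-ins for estimates from the Roget's-category training data:

```python
import math

log_p = {   # log P(word | sense), estimated from training text (stand-ins)
    "animal":  {"species": -2.1, "water": -3.2, "feathers": -2.5},
    "machine": {"lift": -1.8, "grain": -3.5, "water": -4.0},
}
log_prior = {"animal": math.log(36 / 74), "machine": math.log(38 / 74)}
UNSEEN = -8.0   # crude smoothing for words not seen with a sense

def best_sense(window_words):
    # Bayes' rule up to a constant: prior plus per-word log-likelihoods
    scores = {
        sense: log_prior[sense] + sum(table.get(w, UNSEEN) for w in window_words)
        for sense, table in log_p.items()
    }
    return max(scores, key=scores.get)

window = "treadmills attached to cranes were used to lift heavy objects".split()
print(best_sense(window))   # -> 'machine'
```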
27 
“Crane” 
 Occurred 74 times in Grolier’s: 36 as animal, 38 as machine 
 Predictions in new sentences were 99% correct 
 Example: lift water and to grind grain .PP Treadmills attached to cranes were used to lift heavy objects from Roman times.
28
29
30 
Going Home – A play in one act 
 Scene 1: Pennsylvania Station, NYC 
Bonnie: Long Beach? 
Passerby: Downstairs, LIRR Station 
 Scene 2: ticket counter: LIRR 
Bonnie: Long Beach? 
Clerk: $4.50 
 Scene 3: Information Booth, LIRR 
Bonnie: Long Beach? 
Clerk: 4:19, Track 17 
 Scene 4: On the train, vicinity of Forest Hills 
Bonnie: Long Beach? 
Conductor: Change at Jamaica 
 Scene 5: On the next train, vicinity of Lynbrook 
Bonnie: Long Beach? 
Conductor: Right after Island Park.
31 
Question Answering on the web 
 Input: English question 
 Data: documents retrieved by a 
search engine from the web 
 Output: The phrase(s) within the 
documents that answer the question
32 
Examples 
 When was X born? 
When was Mozart born? 
Mozart was born in 1756. 
When was Gandhi born? 
 Gandhi (1869-1948) 
 Where are the Rocky Mountains 
located? 
 What is nepotism?
33 
Common Approach 
 Create a query from the question 
 When was Mozart born -> Mozart born 
 Use WordNet to expand terms and increase recall: 
 Which high school was ranked highest in the US in 1998? 
 “high school” -> (high&school)|(senior&high&school)|(senior&high)|high|highschool 
 Use a search engine to find relevant documents 
 Pinpoint the passage within the document that has the answer, using patterns 
 From IR to NLP
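A sketch of the expansion step with NLTK's WordNet interface (an assumption; the slides only say "use WordNet"):

```python
from nltk.corpus import wordnet as wn

def expand(term):
    """Collect synonym lemmas of a query term to increase recall."""
    variants = {term.replace(" ", "_")}
    for synset in wn.synsets(term.replace(" ", "_")):
        variants.update(lemma.name() for lemma in synset.lemmas())
    return variants

print(expand("high school"))
# e.g. {'high_school', 'highschool', 'senior_high_school', 'senior_high', 'high'}
```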
34 
PRODUCE A BIOGRAPHY OF [PERSON]. 
Only these fields are relevant: 
1. Name(s), aliases: 
2. *Date of Birth or Current Age: 
3. *Date of Death: 
4. *Place of Birth: 
5. *Place of Death: 
6. Cause of Death: 
7. Religion (Affiliations): 
8. Known locations and dates: 
9. Last known address: 
10. Previous domiciles: 
11. Ethnic or tribal affiliations: 
12. Immediate family members 
13. Native Language spoken: 
14. Secondary Languages spoken: 
15. Physical Characteristics 
16. Passport number and country of issue: 
17. Professional positions: 
18. Education 
19. Party or other organization affiliations: 
20. Publications (titles and dates):
35 
Biography of Han Ming 
 Han Ming, born 1944 March in Pyongyan, South 
Korean Lei Fa Women’s University in French law, 
literature, a former female South Korean people, 
chairman of South Korea women’s groups,…Han, 
62, has championed women’s rights and liberal 
political ideas. Han was imprisoned from 1979 to 
1981 on charges of teaching pro-Communist 
ideas to workers, farmers and low-income 
women. She became the first minister of gender 
equality in 2001 and later served as an 
environment minister.
36 
Biography – two approaches 
 To obtain high precision, we handle 
each slot independently using 
bootstrapping to learn IE patterns. 
 To improve the recall, we utilize a 
biography Language Model.
37 
Approach 
 Characteristics of the IE approach 
 Training resource: Wikipedia and its manual 
annotations 
 Bootstrapping interleaves two corpora to improve 
precision 
 Wikipedia: reliable but small 
 Web: noisy but many relevant documents 
 No manual annotation or automatic tagging of corpus 
 Use seed tuples (person, date-of-birth) to find patterns 
 This approach is scalable for any corpus 
 Irrespective of size 
 Irrespective of whether it is static or dynamic 
 The IE system is augmented with language models to 
increase recall
38 
Biography as an IE task 
 We need patterns to extract information from a sentence 
 Creating patterns manually is a time-consuming task, and it does not scale 
 We want to find these patterns automatically
39 
Biography patterns from Wikipedia
40 
Biography patterns from Wikipedia 
• Martin Luther King, Jr., (January 15, 1929 – April 4, 
1968) was the most … 
• Martin Luther King, Jr., was born on January 15, 1929, 
in Atlanta, Georgia.
41 
Run IdFinder on these sentences 
 <Person> Martin Luther King, Jr. </Person>, 
(<Date>January 15, 1929</Date> – <Date> 
April 4, 1968</Date>) was the most… 
 <Person> Martin Luther King, Jr. </Person>, was 
born on <Date> January 15, 1929 </Date>, in 
<GPE> Atlanta, Georgia </GPE>. 
 Take the token sequence that includes the tags of 
interest + some context (2 tokens before and 2 
tokens after)
42 
Convert to Patterns: 
 <My_Person> (<My_Date> – <Date>) was the 
 <My_Person> , was born on <My_Date>, in 
 Remove more specific patterns – if one pattern contains another, keep the smaller one, as long as it is longer than k tokens: 
   <MY_Person> , was born on <My_Date> 
   <My_Person> (<My_Date> – <Date>) 
 Finally, verify the patterns manually to remove irrelevant patterns.
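A sketch of these two steps, cutting out the tag span plus two tokens of context and pruning patterns that merely extend a shorter one (function names are illustrative):

```python
def extract_pattern(tokens, tags=("<My_Person>", "<My_Date>"), context=2):
    """Keep the span from 2 tokens before the first tag of interest
    to 2 tokens after the last one."""
    hits = [i for i, tok in enumerate(tokens) if tok in tags]
    if not hits:
        return None
    lo = max(0, min(hits) - context)
    hi = min(len(tokens), max(hits) + context + 1)
    return " ".join(tokens[lo:hi])

def prune(patterns, k=3):
    """Drop a pattern if a shorter pattern (of > k tokens) is inside it."""
    keep = []
    for p in patterns:
        shorter = [q for q in patterns if q != p and len(q.split()) > k]
        if not any(q in p for q in shorter):
            keep.append(p)
    return keep

sent = "<My_Person> , was born on <My_Date> , in <GPE> .".split()
print(extract_pattern(sent))
# -> '<My_Person> , was born on <My_Date> , in'
```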
43 
Examples of Patterns: 
 502 distinct place-of-birth patterns: 
 600 <MY_Person> was born in <MY_GPE> 
 169 <MY_Person> ( born <Date> in <MY_GPE> ) 
 44 Born in <MY_GPE> <MY_Person> 
 10 <MY_Person> was a native <MY_GPE> 
 10 <MY_Person> 's hometown of <MY_GPE> 
 1 <MY_Person> was baptized in <MY_GPE> 
 … 
 291 distinct date-of-death patterns: 
 770 <MY_Person> ( <Date> - <MY_Date> ) 
 92 <MY_Person> died on <MY_Date> 
 19 <MY_Person> <Date> - <MY_Date> 
 16 <MY_Person> died in <GPE> on <MY_Date> 
 3 < MY_Person> passed away on < MY_Date > 
 1 < MY_Person> committed suicide on <MY_Date> 
 …
44 
Biography as an IE task 
 This approach is good for the consistently annotated fields in Wikipedia: place of birth, date of birth, place of death, date of death 
 Not all fields of interest are annotated; a different approach is needed to cover the rest of the slots
45 
Bouncing between Wikipedia and Google 
 Use one seed only: 
<my person> and <target field> 
 Google: “Arafat” “civil engineering”, we get:
46
47 
Bouncing between Wikipedia and Google 
 Use one seed only: 
 <my person> and <target field> 
 Google: “Arafat” “civil engineering”, we get: 
⇒ Arafat graduated with a bachelor’s degree in civil engineering 
⇒ Arafat studied civil engineering 
⇒ Arafat, a civil engineering student 
⇒ … 
 Using these snippets, corresponding patterns are created, then filtered manually.
48 
Bouncing between Wikipedia and Google 
 Use one seed tuple only: 
 <my person> and <target field> 
 Google: “Arafat” “civil engineering”, we get: 
⇒ Arafat graduated with a bachelor’s degree in civil engineering 
⇒ Arafat studied civil engineering 
⇒ Arafat, a civil engineering student 
⇒ … 
 Using these snippets, corresponding patterns are created, then filtered manually 
 To get more seed pairs, go to Wikipedia biography pages only and search for: 
 “graduated with a bachelor’s degree in” 
 We get:
49
50 
Bouncing between Wikipedia and Google 
 New seed tuples: 
 “Burnie Thompson” “political science” 
 “Henrey Luke” “Environment Studies” 
 “Erin Crocker” “industrial and management engineering” 
 “Denise Bode” “political science” 
 … 
 Go back to Google and repeat the process to get more seed patterns!
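A high-level sketch of this loop. The two search helpers are hypothetical stand-ins for real Google and Wikipedia queries; here they return canned results (the snippets and seeds from these slides) so the sketch runs end to end:

```python
def search_web(query):
    # Hypothetical stand-in for a Google query; returns matching snippets.
    return ["Arafat graduated with a bachelor's degree in civil engineering",
            "Arafat studied civil engineering",
            "Arafat, a civil engineering student"]

def search_wikipedia_bios(pattern):
    # Hypothetical stand-in: run a pattern over Wikipedia biography pages
    # and return the new (person, field-value) pairs it matches.
    return [("Burnie Thompson", "political science"),
            ("Erin Crocker", "industrial and management engineering")]

def bootstrap(seed_pairs, rounds=2):
    patterns = set()
    for _ in range(rounds):
        # 1. Turn each (person, field-value) seed into a quoted web query
        #    and generalize the returned snippets into patterns.
        for person, value in seed_pairs:
            for snippet in search_web(f'"{person}" "{value}"'):
                patterns.add(snippet.replace(person, "<MY_Person>")
                                    .replace(value, "<TARGET>"))
        # (irrelevant patterns are filtered out manually at this point)
        # 2. Harvest new seed pairs from Wikipedia biography pages,
        #    then repeat with the enlarged seed set.
        seed_pairs = [pair for p in patterns for pair in search_wikipedia_bios(p)]
    return patterns

print(bootstrap([("Arafat", "civil engineering")]))
```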
51 
Bouncing between Wikipedia and Google 
 This approach worked well for a few fields, such as: Education, Publications, Immediate family members, and Party or other organization affiliations 
 It did not provide good patterns for every field (e.g., Religion, Ethnic or tribal affiliations, and Previous domiciles); we got a lot of noise 
 For some slots, we created patterns manually
52 
Biography as Sentence Selection and Ranking 
 To obtain high recall, we also want to include sentences that IE may miss, perhaps due to ill-formed sentences (ASR and MT output) 
 Get the top 100 documents from Indri 
 Extract all sentences that contain the person or a reference to him/her 
 Use a variety of features to rank these sentences, as in the sketch below…
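A sketch of that step, with illustrative stand-in features (mention count, position, length) for the unspecified "variety of features":

```python
def rank_sentences(sentences, person):
    scored = []
    for position, sent in enumerate(sentences):
        if person not in sent:       # a real system would also resolve
            continue                 # pronouns and other references
        score = (sent.count(person)        # reward repeated mentions
                 - 0.01 * position         # prefer sentences occurring early
                 - 0.001 * len(sent))      # prefer concise sentences
        scored.append((score, sent))
    return [sent for _, sent in sorted(scored, reverse=True)]

doc = ("Han Ming was born in 1944. She became the first minister of "
       "gender equality in 2001. The weather was mild that year.")
sentences = [s.strip() + "." for s in doc.split(".") if s.strip()]
print(rank_sentences(sentences, "Han Ming"))
```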
