L-R Feature Structure Unification Syntactic Parser
Richard Caneba
RPI Cognitive Science Department
Human-Level Intelligence Laboratory
Intuitions
• An interpretive grammar views syntax as finding the most
  appropriate sequence of head and dependency relationships
  between phrases and words.
• Language understanding occurs (roughly) left to right
• Syntactic trees have a flat structure that gives no syntactic
  preference to sequences of adjunctive modifiers of the same
  category (adjectives, adverbs, modifying prepositional
  phrases)
• We can infer a number of things immediately from the
  perception of a word, although by no means all things
Intuitions cont’d
• Many patterns exist in natural language that can be
  deterministic in some cases, and must be
  defeasible/probabilistic in others.
• Reliably deterministic:
  • [Det N] => NP[Det N]
  • [Adj N] => NP[Adj N]
• Defeasible:
  • [V NP NP…] (<1.0)> VP[V NP NP…]
  • [V NP NP…] (<1.0)> VP[V NP[NP…]…]
• Attempt search ONLY when there is a genuine ambiguity as
  to what the next step in a L-R parse should be
  • Second object/Relative clause modifier in ditransitive context
  • Prepositional phrase attachment
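The deterministic/defeasible split above can be sketched as a rule table, where a probability-1.0 rule rewrites eagerly and search is attempted only when competing defeasible rules match. This is an illustrative sketch, not the parser's actual rule engine; all names here are hypothetical:

```python
# Sketch: deterministic rules rewrite eagerly; defeasible rules
# (probability < 1.0) are deferred until a genuine ambiguity forces search.
RULES = [
    # (pattern, rewrite, probability)
    (("Det", "N"),      "NP[Det N]",    1.0),  # reliably deterministic
    (("Adj", "N"),      "NP[Adj N]",    1.0),  # reliably deterministic
    (("V", "NP", "NP"), "VP[V NP NP]",  0.7),  # ditransitive reading
    (("V", "NP", "NP"), "VP[V NP[NP]]", 0.3),  # relative-clause reading
]

def candidate_rewrites(window):
    """Return all rules matching the current L-R window."""
    return [(rw, p) for pat, rw, p in RULES if pat == tuple(window)]

def needs_search(window):
    """Search only when more than one defeasible rule competes."""
    matches = candidate_rewrites(window)
    return len(matches) > 1 and all(p < 1.0 for _, p in matches)
```

On this sketch, `[Det N]` rewrites without search, while a ditransitive `[V NP NP]` context triggers search between the second-object and relative-clause readings.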
Feature Structure Unification
• A traditional challenge with the HPSG theory of grammar is
  that, in order to preserve the recursiveness of its grammar
  rules, it is required to have a “right-branching”
  structure that posits an additional feature structure node for
  each dependency-head relationship the theory posits
• This is somewhat cognitively unrealistic:
  • Posits an unnecessary amount of structure for a syntactic parse
  • Intuitively there is no syntactic distinction that should be made
    between sequences of adjuncts (it’s hard to tell the difference
    between “the angry green dog” and “the green angry dog”)
Lexical Representation of Syntax
• Each word posits a sequence of head-dependency
  relationships that form a “phrasal chain.”
• These chains are based on the notion that we can infer
  immediately some head-dependency relationships based on
  the syntactic category of the word.
• Roughly, each node in a chain is one of three types (not explicitly
  defined in the lexicon, but nonetheless present):
  • Word Level (WordUtteranceEvent)
  • Dependency Level (PhraseUtteranceEvent)
  • Head Level (PhraseUtteranceEvent)
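The three node levels of a phrasal chain can be sketched as a small data structure. This is a hypothetical rendering for illustration; the field names (`level`, `cand_types`, `phon`) are assumptions, not the system's actual representation:

```python
from dataclasses import dataclass, field
from typing import Optional

# Sketch of the three node levels in a phrasal chain; the lexicon does
# not define these types explicitly, but they are nonetheless present.
@dataclass
class ChainNode:
    level: str                      # "Word", "Dependency", or "Head"
    event_type: str                 # WordUtteranceEvent / PhraseUtteranceEvent
    category: Optional[str] = None  # e.g. "CommonNoun"; None if still undefined
    phon: Optional[str] = None      # phonological form, word level only
    cand_types: list = field(default_factory=list)  # candidate head categories

def common_noun_chain(phon):
    """Phrasal chain a common noun (e.g. 'dog') posits on being heard."""
    word = ChainNode("Word", "WordUtteranceEvent", "CommonNoun", phon)
    dep = ChainNode("Dependency", "PhraseUtteranceEvent", "Noun")
    head = ChainNode("Head", "PhraseUtteranceEvent",
                     cand_types=["Verb", "Preposition", "Noun"])
    return [word, dep, head]
```

The head-level node is deliberately under-defined: only candidate types are known until unification with neighboring chains fixes them.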
Lexical Representation of Syntax
• Let’s do a quick example to show the lexical syntactic
  representation:
• “the angry dog”
• With part-of-speech tags, that is:
• [Det the][Adj angry][N dog].
• The representation in di-graph form:
Lexical Representation of Syntax
[Digraph omitted: syntactic entry for the common noun “dog”. The word-level WordUtteranceEvent (IsA CommonNoun, Phon “dog”) is PartOf a dependency-level PhraseUtteranceEvent (IsA Noun) with a Determiner Specifier, itself PartOf a head-level PhraseUtteranceEvent whose CandTypes are Verb, Preposition, and Noun.]
Lexical Representation of Syntax
[Digraph omitted: syntactic entry for the adjective “angry”. The word-level WordUtteranceEvent (IsA Adjective, Phon “angry”) is PartOf a PhraseUtteranceEvent of CandType Noun.]
   NOTE: will need to posit a dependency layer, to account for adverbs that
   modify the adjective, e.g. “really big”.
Lexical Representation of Syntax
[Digraph omitted: syntactic entry for the determiner “the”. The word-level WordUtteranceEvent (IsA Determiner, Phon “the”) is PartOf a dependency-level PhraseUtteranceEvent of CandType Noun, itself PartOf a head-level PhraseUtteranceEvent whose CandTypes are Verb, Preposition, and Noun.]
Grammar Rules
• In our example, we will need at least two rules:
  • One that unifies the structures posited by the determiner with the
    structures posited by the common noun
  • One that unifies the structures posited by the adjective with either
    the determiner or the noun
  • Let’s consider this from L-R:
     • First, unify the Det-NP-XP structure chain with the Adj-NP structure
       chain
     • Next, unify that resulting structure chain with the N-NP-XP structure
       chain
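The left-to-right unification order can be sketched as successive merges of feature sets, where each merge fills in undefined values from the better-defined chain. This toy `unify_features` is a stand-in for full feature structure unification; the feature names (`cat`, `spr`, `head`) are illustrative assumptions:

```python
def unify_features(a, b):
    """Toy unification of two feature dicts: fail on a conflict between
    defined values, otherwise merge, letting defined values fill in None."""
    out = dict(a)
    for key, val in b.items():
        if key in out and out[key] is not None and val is not None \
                and out[key] != val:
            return None  # conflicting defined values: unification fails
        out[key] = out.get(key) if val is None else val
    return out

# "the angry dog", processed left to right:
det_np = {"cat": "NP", "spr": "Det", "head": None}  # posited by "the"
adj_np = {"cat": "NP", "head": None}                # posited by "angry"
n_np   = {"cat": "NP", "head": "CommonNoun"}        # posited by "dog"

step1 = unify_features(det_np, adj_np)  # Det chain + Adj chain
step2 = unify_features(step1, n_np)     # result + N chain fixes the head
```

The head of the NP stays undefined through the Det-Adj merge and is only resolved when the well-defined noun chain arrives, mirroring the lazy-definition strategy described later.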
Grammar Rules
• Determiner-Adjective Rule
[Digraph omitted: the Det chain for “the” and the Adj chain for “angry” shown side by side, before unification.]
Grammar Rules
• Determiner-Adjective Rule
[Digraph omitted: as above, with a Same link identifying the Det chain’s NP node with the Adj chain’s NP node.]
Grammar Rules
• Determiner-Adjective Rule
[Digraph omitted: the unified structure — a single NP node dominating both “the” (Determiner) and “angry” (Adjective), with CandTypes Verb, Preposition, and Noun at the head level.]
Grammar Rules
• We would like to allow anywhere from zero to arbitrarily many
  adjectives to stand between the determiner and the noun
  that selects the determiner as its specifier.
• We can achieve this by explicitly stating that whenever a Det
  chain and an Adj chain are unified, the result is exposed as a
  determiner on the right wall of the growing parse, as opposed
  to an adjective.
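The effect of re-exposing each Det+Adj unification as a determiner is that the same two rules handle any number of adjectives. A minimal procedural sketch (the function and its return shape are hypothetical, not the parser’s API):

```python
def absorb_prenominal(tokens):
    """Left-to-right sketch: a Det absorbs any run of Adjs, staying
    exposed as Det on the right wall until the head noun closes the NP."""
    exposed = None
    absorbed = []
    for cat, word in tokens:
        if cat == "Det":
            exposed = "Det"
            absorbed = [word]
        elif cat == "Adj" and exposed == "Det":
            absorbed.append(word)    # unify, re-expose as Det
        elif cat == "N" and exposed == "Det":
            absorbed.append(word)
            return ("NP", absorbed)  # noun selects the Det as its specifier
    return None  # no complete NP found

# zero or many adjectives both parse with the same rules:
absorb_prenominal([("Det", "the"), ("N", "dog")])
absorb_prenominal([("Det", "the"), ("Adj", "angry"),
                   ("Adj", "green"), ("N", "dog")])
```

Because the Adj rule always sees a Det on the right wall, no extra rule or recursive tree node is needed per adjective.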
Grammar Rules
• Determiner-Adjective Resulting Structure
[Digraph omitted: the resulting Det-Adj structure, identical in shape to the unified structure on the previous slide.]
Grammar Rules
• Determiner-Adjective Resulting Structure + NP
[Digraph omitted: the Det-Adj structure unified with the chain for “dog”. The CommonNoun word-level node (Phon “dog”) joins the shared NP node, completing the NP “the angry dog”.]
Grammar Rules
• Expose the resulting structure from the Det-Adj unification as
  just the Det structure:
[Tree omitted: the Det-Adj chain (XP over NP over Det, Adj) alongside the N chain (XP over NP over Spr, N).]
Grammar Rules
• Expose the resulting structure from the Det-Adj unification as
  just the Det structure:
[Tree omitted: as above, with the Det-Adj chain marked as the Border and the N chain marked as the Frontier.]
Grammar Rules
• Expose the resulting structure from the Det-Adj unification as
  just the Det structure:
[Tree omitted: as above, with Same links identifying the Border chain’s XP and NP nodes with the Frontier chain’s XP and NP nodes.]
Grammar Rules
<!--Pre-head Adjective Modifier w/ Det: Shift Border-->
<constraint shouldFalsify="false">
    Border(?ba, ?t0, ?w)^
    Border(?bb, ?t0, ?w)^
    Frontier(?fa, ?t1, ?w)^
    Frontier(?fb, ?t1, ?w)^
    Meets(?t0, ?t1, E, ?w)^
    PartOf(?ba, ?bb, E, ?w)^
    PartOf(?fa, ?fb, E, ?w)^
    IsA(?ba, Determiner, E, ?w)^
    IsA(?bb, Noun, E, ?w)^
    IsA(?fa, Adjective, E, ?w)^
    IsA(?fb, Noun, E, ?w)
    ==>
    Same(?bb, ?fb, E, ?w)^
    Border(?ba, ?t1, ?w)
</constraint>

<!--Subcategorization Rules: NP Specifier-->
<constraint shouldFalsify="false">
    Border(?ba, ?t0, ?w)^
    Border(?bb, ?t0, ?w)^
    Frontier(?fa, ?t1, ?w)^
    Frontier(?fb, ?t1, ?w)^
    Meets(?t0, ?t1, E, ?w)^
    PartOf(?ba, ?bb, E, ?w)^
    PartOf(?fa, ?fb, E, ?w)^
    IsA(?ba, Determiner, E, ?w)^
    IsA(?bb, Noun, E, ?w)^
    Specifier(?fa, ?spr, E, ?w)^
    IsA(?spr, Determiner, E, ?w)^
    IsA(?fb, Noun, E, ?w)^
    Heard(?wue, E, ?w)^
    IsA(?wue, WordUtteranceEvent, ?t1, ?w)
    ==>
    Same(?ba, ?spr, E, ?w)^
    Same(?bb, ?fb, E, ?w)^
    Border(?wue, ?t1, ?w)^
    _NPSPR(?ba, ?bb, ?fa, ?fb, E, ?w)
</constraint>
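In procedural terms, the first constraint above can be read roughly as follows. This is a loose Python paraphrase with illustrative data shapes, not the actual constraint engine:

```python
def shift_border(border, frontier):
    """Paraphrase of the pre-head adjective rule: if the border chain is a
    Determiner under a Noun and the adjacent frontier chain is an Adjective
    under a Noun, declare the two Noun phrase nodes the Same and shift the
    Border to the determiner at the frontier's time step."""
    ba, bb = border    # (dependent, head) nodes of the border chain
    fa, fb = frontier  # (dependent, head) nodes of the frontier chain
    if (ba["isa"] == "Determiner" and bb["isa"] == "Noun"
            and fa["isa"] == "Adjective" and fb["isa"] == "Noun"):
        return {"same": (bb, fb), "new_border": ba}
    return None  # antecedent not satisfied; rule does not fire
```

Note how the consequent keeps the Determiner node on the border, which is exactly the “expose as determiner” move that licenses any number of stacked adjectives.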
Grammar Rules
[send] [john] [a] [message] [that] [says] [“hi”].
Grammar Rules




[V send] [N john] [Det a] [N message] [RelP that] [V says] [Q “hi”].
Grammar Rules
[Diagram omitted: the phrasal chain posited by each word — word-level categories VP, NP, NP, NP, VP, VP, NP, with XP/NP phrase nodes above them.]
[V send] [N john] [Det a] [N message] [RelP that] [V says] [Q “hi”].
Grammar Rules
[Tree omitted: the completed parse — the matrix VP over “send”, NPs over “john” and “a message”, an embedded VP over “that says”, and an NP over “hi”.]
[V send] [N john] [Det a] [N message] [RelP that] [V says] [Q “hi”].
Grammar Rules
• Benefits of this feature-structure unification parse:
    •   Captures the intuition that when we hear a word and posit its feature structure, we can infer
        the existence of not only the word’s direct feature structure (usually generated by lexical rules) but also
        additional structures, their head/dependency relationships, and some definition of the values in
        the structure.
    •   Ambiguities (e.g. the head of an NP) are resolved from L-R through lazy definitions and unification of
        under-defined structures with well-defined structures in terms of particular features.
    •   Posits no more structure in the parse tree than is necessary to reflect a parse,
        whereas theories like HPSG posit a large number of structures in a branching tree in order to
        preserve the recursivity of their grammar rules.
    •   However, we have shown that with feature structure unification, at least in theory, we can preserve
        the recursivity of many of the rules without requiring a left- or right-branching structure.
    •   All of the structure necessary to build a parse is known from the beginning.
Grammar Rules!
• The future:
   •   Ungrammaticality: when objects aren’t where they are supposed to be, search for a likely
       head-dependency relationship
         •   Missing arguments: “Car is big.”
         •   Extra words (it is rare for full content words to be extra, but it occurs in natural language: “I saw the, um,
             car.”)
         •   Dependents out of order: “Give the car me.”
         •   Dangling dependent: “
         •   Will require a good branch-and-bound system that only performs search when a reasonable
             expectation/prediction is violated.
   •   Give a feature-structure unification account of garden path sentences
         •   Should be fairly natural given the L-R predictive nature of the parser
   •   Attach a semantic representation that generates word sense based on head-dependency
       relationships.
         •   Syntax should be closely tied to semantics, in that each serves to help compute the other to varying degrees.
   •   Examine discourse from a syntactic perspective and syntax from a discourse perspective, and use
       both to disambiguate simultaneously.
Notes on Theory (boring)
• By having a lexical representation that is closely tied to the syntax, a number of advantages
  fall out:
   • Parsimony: by allowing a lot of information to be loosely defined or undefined at the lexical
     level, we do not need to posit additional lexical entries to cover all possible configurations of
     a phrase’s arguments, nor do we need an excessive number of lexical rules to
     generate these representations.
   • Generativity: a word’s sense is at least in part generated by its relationship to its dependents
     and head, and the semantic/syntactic types of these dependents/heads can in theory
     compute a word’s sense on the fly (inspired by Pustejovsky’s GL theory).
   • Context embedding: by tying your theory of the lexicon closely to syntactic theory, you move
     towards embedding your lexical representation in a cognitive system that is closely tied to the
     way words are ACTUALLY used.
Lexical Mosaics
• Thus, we can see that the sense of a word comes from a number of
  different sources:
  • Memory
  • Syntactic context
  • Pragmatic/Discourse factors
• The hope for future research is to tie these together in an
  organized way, giving a theory of lexical representation that is tied
  closely to these factors in a computable and tractable manner.
• Early goals:
  • Compute word senses from syntactic context + memory (very
    difficult)
  • Use syntactic context to disambiguate lexical ambiguity
  • Use generative word sense to disambiguate syntactic ambiguity
  • Simultaneously attempt to give a computational account of lexical
    memory, syntactic parsing, and pragmatics/discourse.
