How the statistical revolution changes (computational) linguistics

Mark Johnson
EACL workshop, March 2009

Thanks to Eugene Charniak and the BLLIP group, and Antske Fokkens, John Maxwell, Paola Merlo, Yusuke Miyao and Mark Steedman.



No warranty . . .

Half the money I spend on advertising is wasted.

The problem is: I don’t know which half.

John Wanamaker


What is computational linguistics?

• Engineering goals: building useful systems
  ◮ currently relies more heavily on statistics and machine learning than on linguistic theory
  ◮ not all scientific knowledge has engineering applications
• Scientific goals: understanding computational aspects of linguistic processes
  ◮ language comprehension, production and acquisition (Chomsky's "use of knowledge of language")


Outline

Grammar-based and statistical parsing

From grammar-based to treebank parsing

Features in our rescoring parser

Open-world grammars and parsing

Statistical models of language acquisition and comprehension

Conclusion


Feature-based grammars and parsing

• Circa 1980, GPSG showed how to decompose long-distance "movement" phenomena into sequences of strictly local dependencies (feature passing)
⇒ An explosion of mathematical and computational work on "unification-based" grammars that continues today
• Parsing with "unification-based" grammars:
  ◮ the (computational) linguist devises a grammar formalism (e.g., typed feature structure constraints)
  ◮ the linguist writes a grammar in the grammar formalism
  ◮ the computational linguist devises a parser (inference engine) for the grammar formalism


Statistical treebank parsing

• In the late 1980s statistical techniques (e.g., HMMs) became the dominant approach to speech recognition
• Parsing with statistical treebank parsers:
  ◮ the linguist annotates a corpus with parses
  ◮ the computational linguist devises a class of statistical models of parses (e.g., the features of the model)
  ◮ the computational linguist devises an estimation procedure that maps a parsed corpus to a statistical model (e.g., feature weights)
  ◮ the computational linguist devises a parser for the statistical model
• Why has statistical treebank parsing come to dominate computational linguistics?


Both approaches involve linguistic knowledge

• Linguists write the grammars used in manual grammar-based approaches
• Linguists write the annotation guidelines and direct the annotation effort in statistical treebank parsing
  ◮ how do corpus and annotation choices affect our statistical models?
• The corpus-based approach forces annotators to focus on common constructions (rather than rare constructions that may be scientifically more interesting)
• Large grammars are difficult to construct and maintain (software engineering)
• Treebanks are inherently redundant ⇒ errors seem less catastrophic


Probabilities and grammars

• Abney (1996) showed how to define MaxEnt models over virtually any linguistic representations
  ◮ no requirement that features be independent ("context-free")
⇒ No principled reason why we can't replicate treebank parsing techniques with grammar-based representations (the general model form is sketched below)
• Are our probabilistic models appropriate?
  ◮ is it reasonable to model P(Sentence)?
  ◮ perhaps we should estimate (and learn from) P(Sentence | Meaning) instead?
    ⇒ requires an explicit meaning representation
  ◮ part of the attraction of statistical machine translation: P(English | French), i.e., French as the "language of thought"
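For concreteness, the general form of such a model (standard MaxEnt notation, not specific to any one formalism): an analysis t with feature vector f(t) and learned weights w receives

\[
P_w(t) \;=\; \frac{\exp\big(w \cdot f(t)\big)}{\sum_{t'} \exp\big(w \cdot f(t')\big)}
\]

where t' ranges over the competing analyses. Since only the dot product w · f(t) matters, the features may inspect any part of the representation, local or not.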


Why parse?

• Engineering goal: we hope a general-purpose parser will be useful for other engineering tasks
  ⇒ broad coverage
  ⇒ robust
  ⇒ good average-case performance
• There are robust, broad-coverage grammar-based parsers
  ◮ HPSG (Miyao and Tsujii, Cholakov et al.)
  ◮ LFG (Maxwell and Kaplan, van Genabith and Way)
  ◮ CCG (Hockenmaier, Clark and Curran)
• Complex linguistic representations, but their phrase-structure parsing accuracy is no better than statistical treebank parsers'
• Not that many applications for parsing
  ◮ often used as a language model (keep the probabilities, throw away the parses)


Capturing vs. covering linguistic generalizations

• Capturing a generalization: the grammar accurately describes the phenomenon at the appropriate level, e.g., subject-verb agreement via PERSON and NUMBER features
• Covering a generalization: the model covers common cases of a generalization, perhaps indirectly, e.g., via head-to-head POS dependencies

[Example: the head-to-head POS dependencies in
(S (NP (NNP Sam))
   (VP (VBZ raises) (NP (DT the) (NN price)))
   (. .))]

• An "engineering" parser only needs to cover generalizations
• But feature design requires linguistic insight
  ◮ basic linguistic insights tend to have the greatest impact


Outline

Grammar-based and statistical parsing

From grammar-based to treebank parsing

Features in our rescoring parser

Open-world grammars and parsing

Statistical models of language acquisition and comprehension

Conclusion


Stochastic Lexical Functional Grammar

• The PARC XLE LFG parser produces a set of parses (c- and f-structure pairs) for the input sentence
• A set of "feature extractors" count how often various structures appear in each parse
  ◮ local trees (i.e., phrase-structure rules)
  ◮ local f-structures (i.e., argument structure frames)
  ◮ long-distance dependency paths
  ◮ misc. features, e.g., coordination features
• The scorer computes a "soft max" of the weighted linear combination of feature values (sketched below)

[Figure: the XLE parser (PARC) maps "the cat chased the dog" to candidate c-/f-structure pairs; feature extractors map each parse to a feature vector such as [0, 1, 1, 0, ...]; the scorer maps the vectors to scores such as 0.25, 0.22, ...]
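A minimal sketch of the scoring step (hypothetical toy feature vectors and weights; the real XLE feature extractors and trained weights are far richer):

```python
import math

def softmax_scores(feature_vectors, weights):
    """Soft-max of the weighted linear combination w . f(parse) for each parse."""
    logits = [sum(w * f for w, f in zip(weights, fv)) for fv in feature_vectors]
    m = max(logits)                        # subtract the max for numerical stability
    exps = [math.exp(l - m) for l in logits]
    z = sum(exps)
    return [e / z for e in exps]

# Two candidate parses of "the cat chased the dog" as feature-count vectors
parses = [[0, 1, 1, 0], [0, 0, 1, 1]]
weights = [0.5, -0.2, 1.0, 0.3]            # hypothetical learned feature weights
print(softmax_scores(parses, weights))     # [0.377..., 0.622...]
```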


Features in MaxEnt parsing models

• A feature can be any real-valued function of the parses
• As far as the statistical model is concerned, the feature vectors are all that matters
  ◮ details of linguistic representations are irrelevant so long as they are rich enough to compute the feature vectors
• The most valuable features were defined on c-structures alone
  ◮ c-structures are finer-grained than f-structures
  ◮ all my f-structure features can be recoded as c-structure features
• Are there important generalizations that f-structure features cover better?

[Same XLE pipeline figure as on the previous slide]


Discriminative rescoring treebank parser

• Use 50-best parses from the Charniak parser
• The log of Charniak's parse probability is one of the model's features
  ⇒ Charniak's features are imported into the model
  ⇒ concentrate on non-local features not included in Charniak's model
• Trained from more data than the SLFG system was ⇒ more accurate
  ◮ these days it is common to train grammar-based models from the PTB translated into grammar-based representations
  ◮ but the translated PTB doesn't contain more information than the original PTB did

[Figure: the Charniak parser maps "the cat chased the dog" to its 50-best parses; feature extractors map each parse to a feature vector; the scorer maps the vectors to scores]


Outline

Grammar-based and statistical parsing

From grammar-based to treebank parsing

Features in our rescoring parser

Open-world grammars and parsing

Statistical models of language acquisition and comprehension

Conclusion


Conditional estimation for parse rescoring

• Easy to over-fit the training data with a large number of features
  ⇒ regularize by adding a penalty term to the log likelihood
  ◮ L1 penalty term ⇒ sparse feature weight vector
  ◮ L2 penalty term (Gaussian prior) seems best
• The 50-best parses T(s) may not include the true parse t⋆(s)
  ⇒ train the rescorer to prefer the parses T⋆(s) in T(s) closest to t⋆(s)
• Often several parses from the 50-best list are equally close to the true parse
  ⇒ EM-inspired (non-convex) loss function
• Direct numerical optimization with L-BFGS (modified for the L1 regularizer) produces the best results

Riezler, King, Kaplan, Crouch, Maxwell and Johnson (2002); Goodman (2004); Andrew and Gao (2007)
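As a sketch of the resulting objective (following Riezler et al. 2002 and related work, with D the training data): maximize the regularized conditional log likelihood

\[
L(w) \;=\; \sum_{s \in D} \log \frac{\sum_{t \in T^{\star}(s)} \exp\big(w \cdot f(t)\big)}{\sum_{t' \in T(s)} \exp\big(w \cdot f(t')\big)} \;-\; c \sum_{j} |w_j|^{p}
\]

with p = 1 (L1, sparse) or p = 2 (L2, Gaussian prior). The sum over T⋆(s) in the numerator is what makes the loss EM-like and non-convex when several parses are equally close to the gold parse.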


Features for rescoring parses

• The parse rescorer's features can be any computable function of the parse
• The choice of features is the most important and least understood aspect of the parser
  ◮ feature design has a much greater impact on performance than the learning algorithm
• Features can be based on a linguistic theory (or more than one)
• . . . but need not be
  ◮ "shot-gun" or "hail Mary" features are very useful
• Feature selection: a feature's values on T⋆(s) and T(s) \ T⋆(s) must differ on at least 5 sentences s ∈ D (see the sketch below)
• The Charniak parser's log probability combines all of the generative parser's conditional distributions into a single rescorer feature ⇒ rescoring should never hurt
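A sketch of that selection criterion (an illustrative data layout, not the actual implementation): each parse is a dict from feature names to values, and a feature survives only if it distinguishes T⋆(s) from T(s) \ T⋆(s) on at least 5 sentences:

```python
from collections import defaultdict

def select_features(data, min_sentences=5):
    r"""data: one (best_parses, other_parses) pair per sentence s, i.e.
    (T*(s), T(s) \ T*(s)); each parse is a dict mapping feature -> value."""
    discriminates = defaultdict(int)  # feature -> #sentences on which it differs
    for best, other in data:
        for feat in {f for p in best + other for f in p}:
            if {p.get(feat, 0) for p in best} != {p.get(feat, 0) for p in other}:
                discriminates[feat] += 1
    return {f for f, n in discriminates.items() if n >= min_sentences}
```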


Lexicalized and parent-annotated rules

• Lexicalization associates each constituent with its head
• Ancestor annotation provides a little "vertical context"
• Context annotation indicates constructions that only occur in main clauses (c.f. Emonds)

[Example tree, annotated with Rule, Heads, Ancestor and Context:
(ROOT
  (S (NP (WDT That))
     (VP (VBD went)
         (PP (IN over)
             (NP (NP (DT the) (JJ permissible) (NN line))
                 (PP (IN for)
                     (NP (ADJP (JJ warm) (CC and) (JJ fuzzy))
                         (NNS feelings))))))
     (. .)))]


n-gram rule features generalize rules

• Collect adjacent constituents in a local tree
• Also include the relationship to the head (e.g., adjacent? left or right?)
• Parameterized by ancestor annotation, lexicalization and head type
• There are 5,143 unlexicalized rule bigram features and 43,480 lexicalized rule bigram features

[Example tree, with one constituent pair annotated "Left of head, non-adjacent to head":
(ROOT
  (S (NP (DT The) (NN clash))
     (VP (AUX is)
         (NP (NP (DT a) (NN sign))
             (PP (IN of)
                 (NP (NP (DT a) (JJ new) (NN toughness) (CC and) (NN divisiveness))
                     (PP (IN in)
                         (NP (NP (NNP Japan) (POS 's))
                             (JJ once-cozy) (JJ financial) (NNS circles)))))))
     (. .)))]


Bihead dependency features

[Example: the tree for "A record date hasn't been set.":
(S (NP (DT A) (NN record) (NN date))
   (VP (VBZ has) (RB n't)
       (VP (VBN been)
           (VP (VBN set))))
   (. .))
with the extracted bihead dependency (S (NP (NN date)) (VP (VBN set)))]

• Bihead dependency features approximate linguistic function-argument dependencies; a sketch follows below
• Computed for lexical (≈ semantic) and functional (≈ syntactic) heads
• One feature for each head-to-head dependency found in the training corpus (70,000 features in all)
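A toy sketch of extracting such head-to-head dependencies (my own tuple encoding and an invented head-percolation table with only the entries this example needs; real head-finding rules are much more elaborate):

```python
HEAD_CHILD = {"S": "VP", "VP": "VBZ", "NP": "NN"}  # toy head-percolation table

def head(tree):
    """Lexical head of a tuple-encoded tree (label, child, ...); leaves are strings."""
    if isinstance(tree[1], str):          # a preterminal (POS, word) is its own head
        return tree
    for kid in tree[1:]:                  # look for the designated head child
        if kid[0] == HEAD_CHILD.get(tree[0]):
            return head(kid)
    return head(tree[-1])                 # fallback: rightmost child

def biheads(tree, deps=None):
    """Collect (head, dependent's head) pairs from every local tree."""
    if deps is None:
        deps = []
    if isinstance(tree[1], str):
        return deps
    h = head(tree)
    for kid in tree[1:]:
        kh = head(kid)
        if kh != h:
            deps.append((h, kh))
        biheads(kid, deps)
    return deps

tree = ("S", ("NP", ("NNP", "Sam")),
             ("VP", ("VBZ", "raises"), ("NP", ("DT", "the"), ("NN", "price"))))
print(biheads(tree))
# [(('VBZ', 'raises'), ('NNP', 'Sam')), (('VBZ', 'raises'), ('NN', 'price')),
#  (('NN', 'price'), ('DT', 'the'))]
```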


Head trees record all dependencies

• Head trees consist of a (lexical) head, all of its projections and (optionally) all of the siblings of these nodes
• Correspond roughly to TAG elementary trees
• Parameterized by head type, number of sister nodes and lexicalization

[Example tree:
(ROOT
  (S (NP (PRP They))
     (VP (VBD were)
         (VP (VBN consulted)
             (PP (IN in) (NP (NN advance)))))
     (. .)))]


Rightmost branch bias

• The RightBranch feature's value is the number of nodes on the rightmost branch (ignoring punctuation) (c.f. Charniak 00); a sketch follows below
• Reflects the tendency toward right branching in English
• Only 2 different features, but very useful in the final model!

[Example: the tree for "That went over the permissible line for warm and fuzzy feelings.", with the nodes on its rightmost branch (ignoring the final punctuation) highlighted]
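A sketch of computing that value on a simple tuple-encoded tree (my own toy encoding; it assumes punctuation POS tags can be listed explicitly):

```python
PUNCT = {".", ",", ":", "``", "''"}

def rightbranch_length(tree):
    """Count the nodes on the rightmost branch, ignoring punctuation.
    A tree is (label, child, ...); leaves are strings."""
    n = 0
    node = tree
    while isinstance(node, tuple):
        n += 1
        kids = [c for c in node[1:]
                if not (isinstance(c, tuple) and c[0] in PUNCT)]
        if not kids or not isinstance(kids[-1], tuple):
            break
        node = kids[-1]          # step to the rightmost non-punctuation child
    return n

tree = ("S", ("NP", ("NNP", "Sam")),
             ("VP", ("VBZ", "raises"), ("NP", ("DT", "the"), ("NN", "price"))),
             (".", "."))
print(rightbranch_length(tree))  # 4: S, VP, NP, NN
```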


Constituent Heavyness and location

• Heavyness measures the constituent's category, its (binned) size and (binned) closeness to the end of the sentence

[Example: the same tree for "That went over the permissible line for warm and fuzzy feelings.", with a constituent annotated "> 5 words" and "= 1 punctuation"]


Coordination parallelism (1)

• A CoPar feature indicates the depth to which adjacent conjuncts are parallel

[Example tree for "They were consulted in advance and were surprised at the action taken.", whose coordinated VPs are isomorphic to depth 4:
(ROOT
  (S (NP (PRP They))
     (VP (VP (VBD were)
             (VP (VBN consulted)
                 (PP (IN in) (NP (NN advance)))))
         (CC and)
         (VP (VBD were)
             (VP (VBN surprised)
                 (PP (IN at)
                     (NP (NP (DT the) (NN action))
                         (VP (VBN taken)))))))
     (. .)))]


Coordination parallelism (2)

• The CoLenPar feature indicates the difference in length between adjacent conjuncts, and whether the pair contains the last conjunct; a sketch follows below

[Example: in "They were consulted in advance and were surprised at the action taken.", the first conjunct has 4 words and the second has 6, so the CoLenPar feature is (2, true)]
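A sketch of the feature computation (assuming the conjunct lengths in words are already available):

```python
def colenpar(conjunct_lengths):
    """For each adjacent pair of conjuncts: (length difference,
    does this pair contain the last conjunct?)."""
    last = len(conjunct_lengths) - 1
    return [(abs(conjunct_lengths[i + 1] - conjunct_lengths[i]), i + 1 == last)
            for i in range(last)]

# "were consulted in advance" (4 words), "were surprised at the action taken" (6 words)
print(colenpar([4, 6]))   # [(2, True)]
```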


Word

• A Word feature is a word plus n of its parents (c.f. Klein and Manning's non-lexicalized PCFG)
• A WProj feature is a word plus all of its (maximal projection) parents, up to its governor's maximal projection

[Example: the tree for "That went over the permissible line for warm and fuzzy feelings."]


Constituent "edge neighbour" features

[Example: in the tree for "That went over the permissible line for warm and fuzzy feelings.", the edge context (IN over) (NP (DT the . . . )) is marked]

• Edge features are a kind of bigram context for constituents
• Would be difficult to incorporate into a generative parser


Tree n-gram

• A tree n-gram feature is a tree fragment that connects a sequence of n adjacent words, for n = 2, 3, 4 (c.f. Bod's DOP models)
• Lexicalized and non-lexicalized variants
• There are 62,487 tree n-gram features

[Example: the tree for "That went over the permissible line for warm and fuzzy feelings."]


Experimental setup

• Feature tuning experiments done using Collins' split: sections 2-19 as train, 20-21 as dev, and 24 as test
• T(s) computed using the Charniak 50-best parser
• Features which vary on fewer than 5 sentences are pruned
• Optimization performed using the LMVM optimizer from the PETSc/TAO optimization package, or the averaged perceptron
• Regularizer constant c adjusted to maximize f-score on dev


Evaluating features

• The feature weights are not that indicative of how important a feature is
• The MaxEnt ranker with regularizer tuning takes approx. 1 day to train
• The averaged perceptron algorithm takes approximately 2 minutes
  ◮ used in experiments comparing different sets of features
  ◮ used to compare models with the following features: NLogP, Rule, NGram, Word, WProj, RightBranch, Heavy, NGramTree, HeadTree, Heads, Neighbours, CoPar, CoLenPar


Adding one feature class

• Averaged perceptron baseline with only the base parser log prob feature
  ◮ sections 20-21 f-score = 0.894913
  ◮ section 24 f-score = 0.889901


Subtracting one feature class

• Averaged perceptron baseline with all features
  ◮ sections 20-21 f-score = 0.906806
  ◮ section 24 f-score = 0.902782


Feature selection is hard!

[Scatter plot, "Averaged perceptron feature selection": f-score on sections 20-21 (x-axis, roughly 0.901-0.911) against f-score on section 24 (y-axis, roughly 0.892-0.908), one point per selected feature set]

• Greedy feature selection using the averaged perceptron, optimizing f-score on sections 20-21
• All models also evaluated on section 24


Comparing estimators

• Training on sections 2-19, regularizer tuned on sections 20-21, evaluated on section 24

  Estimator              # features    sec 20-21    sec 24
  MaxEnt model, p = 2       670,688       0.9085    0.9037
  MaxEnt model, p = 1        14,549       0.9078    0.9024
  averaged perceptron       523,374       0.9068    0.9028
  expected f-score          670,688       0.9084    0.9029

• None of the differences are significant
• Because the exponential model with p = 2 was the first model I tested new features on, they may be biased to work well with it


Outline

Grammar-based and statistical parsing

From grammar-based to treebank parsing

Features in our rescoring parser

Open-world grammars and parsing

Statistical models of language acquisition and comprehension

Conclusion


Conventional grammars are closed-world

• The closed-world assumption for grammars
  ◮ rules and lexical entries define the set of grammatical structures
  ◮ everything not grammatical is ungrammatical
• "Parsing as deduction": parsing is the process of proving the grammaticality of a sentence
• But: the goal is understanding what the speaker is trying to say, not determining whether the sentence is grammatical
• My ideal parser would be open-world
  ◮ even ungrammatical sentences are interpretable, e.g., man bites dog ≠ dog bites man
  ◮ words and constructions we recognize provide information about the sentence's meaning
  ◮ unknown words or phrases do not cause interpretation to fail
  ◮ parsing and acquisition are two aspects of the same process
• Statistical treebank parsers are open-world
  ◮ every possible tree receives positive probability


Open-world grammar-based parsing via relaxation

• Replace "hard" unification equality constraints with "soft" probabilistic features
  ◮ e.g., for subject-verb agreement, introduce a feature that fires when the subject's agreement features disagree with the verb's (see the sketch below)
• Most techniques for making grammar-based parsers broad-coverage perform grammar or lexicon relaxation
  ◮ unknown-word lexical entry guesser
  ◮ fragment parsing mechanisms
• Under this approach, grammaticality does not play a central role in parsing
• Statistical treebank parsers relax via smoothing and Markovization
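A toy sketch of such a soft constraint (invented attribute-value encoding; real AVMs are much richer):

```python
def agreement_clash(subject, verb):
    """Fires (returns 1) when the subject's and verb's agreement features clash.
    Instead of causing unification failure, the feature's weight merely lowers
    the parse's score."""
    for attr in ("PERSON", "NUMBER"):
        if attr in subject and attr in verb and subject[attr] != verb[attr]:
            return 1
    return 0

print(agreement_clash({"PERSON": 3, "NUMBER": "sg"},
                      {"PERSON": 3, "NUMBER": "pl"}))   # 1: NUMBER disagrees
```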


Open-world grammar-based parsing via disfluency model

• Intuition: ungrammatical sentences are understood by analogy with grammatical ones
• Possible formulation: a noisy channel model where the channel generates disfluencies
  ◮ the source model only generates grammatical sentences
• We've used this kind of model to detect and correct disfluencies in speech transcripts

[Figure: a Grammar generates grammatical analyses (e.g., "the man bites the dog"); a noisy channel (disfluency model) maps these to grammatical and disfluent analyses (e.g., "man bites dog")]


Outline

Grammar-based and statistical parsing

From grammar-based to treebank parsing

Features in our rescoring parser

Open-world grammars and parsing

Statistical models of language acquisition and comprehension

Conclusion


Statistical models in linguistics

• The scientific side of computational linguistics studies the computational aspects of language
  ◮ computation (the meaning-respecting manipulation of meaning-bearing symbols) is a process
  ⇒ computational linguistics bears most directly on language comprehension, production and acquisition
• Recent explosion of interest in:
  ◮ computational psycholinguistics (esp. sentence processing) (Bachrach, Hale, Keller, Levy, Roark, etc.)
  ◮ computational models of language acquisition (esp. phonology and morphology) (Albright, Boersma, Frank, Goldsmith, Hayes, Pater, etc.)


Optimality theory

• Smolensky's Optimality Theory (OT) seems exquisitely designed to let numerophobes perform optimization
• Detailed OT analyses of many aspects of phonology and morphology
• Optimality Theory is a limiting case of Harmony Theory, which is equivalent to MaxEnt models
  ◮ OT constraints ≡ MaxEnt features
⇒ Easy for linguists to move from OT analyses to MaxEnt models (see the sketch below)
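One standard way to make the correspondence concrete (a textbook formulation, not specific to this talk): with constraint-violation counts C_i(x) and non-negative weights w_i, a MaxEnt / Harmonic Grammar model assigns each candidate x

\[
P(x) \;=\; \frac{\exp\big(-\sum_i w_i\, C_i(x)\big)}{\sum_{x'} \exp\big(-\sum_i w_i\, C_i(x')\big)}
\]

and classical OT's strict ranking emerges in the limit where each constraint's weight dominates the combined weights of all lower-ranked constraints (w_1 ≫ w_2 ≫ . . .).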


Chomsky ought to be a Bayesian!

\[
\underbrace{P(\text{Grammar} \mid \text{Data})}_{\text{Posterior}}
\;\propto\;
\underbrace{P(\text{Data} \mid \text{Grammar})}_{\text{Likelihood}}\;
\underbrace{P(\text{Grammar})}_{\text{Prior}}
\]

• Likelihood measures how well the grammar describes the data
• Prior expresses knowledge of the grammar before the data is seen
  ◮ can be very specific (e.g., Universal Grammar), or
  ◮ can be generic (e.g., prefer shorter grammars)
  ◮ can express markedness preferences that can be overridden by data
• Posterior is a distribution over grammars
  ◮ expresses uncertainty about which grammar is correct
• In principle we can empirically evaluate the utility of specific linguistic universals in language acquisition


Bayesian updating integrates parsing and learning

\[
\underbrace{P(\text{Grammar} \mid \text{Sentence}_{1:n})}_{\text{Updated grammar}}
\;\propto\;
\underbrace{P(\text{Sentence}_n \mid \text{Grammar})}_{\approx\,\text{parsing}}\;
\underbrace{P(\text{Grammar} \mid \text{Sentence}_{1:n-1})}_{\text{Previous grammar}}
\]

• Incremental Bayesian belief updating updates the posterior as each data item is encountered
• With e.g. PCFGs, the update process requires computing expected rule counts and adding these to running rule-count sums (a sketch follows below)
• Sampling the sentence's likely parses is a general way of computing these expectations
⇒ Learning might be a low-cost by-product of parsing
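A toy sketch of that update for a PCFG (illustrative names only; `sample_parses` and `parse.rules()` stand in for whatever sampler and parse representation the parser provides):

```python
from collections import Counter

rule_counts = Counter()  # running pseudo-counts; a rule is e.g. ("NP", ("DT", "NN"))

def update(sentence, grammar, n_samples=10):
    """One belief-update step: estimate expected rule counts from sampled
    parses of this sentence and add them to the running sums."""
    parses = sample_parses(sentence, grammar, n_samples)  # hypothetical sampler
    for parse in parses:
        for rule in parse.rules():                        # hypothetical: rules used
            rule_counts[rule] += 1.0 / n_samples          # each sample adds 1/n

def rule_prob(rule):
    """Relative-frequency estimate of P(rhs | lhs) from the running counts."""
    lhs_total = sum(c for r, c in rule_counts.items() if r[0] == rule[0])
    return rule_counts[rule] / lhs_total if lhs_total else 0.0
```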


Non-parametric Bayesian inference

• Most standard learning methods are based on optimizing a function of a finite vector of real-valued parameters
• Learn rules or structures by iterating:
  ◮ generate hopefully useful rules/structures
  ◮ estimate their parameters via a parametric estimator
  ◮ prune low-probability rules/structures
• Non-parametric Bayesian methods provide a theoretically sound framework for identifying the relevant finite subset of an infinite set of entities
  ◮ example: learn a finite set of phonemic forms from an infinite set of potential phonemic forms
  ◮ Monte Carlo sampling methods


Computational models of word learning

• During their "word spurt", children learn words in "one shot"
• No obvious limit on the number of possible words ⇒ non-parametric Bayes
• Learn to segment words from unsegmented broad phonemic transcription (Brent)
  ◮ example: y u w a n t t u s i D 6 b u k ("you want to see the book")
• Goldwater (2006) learns a bigram model of word sequences without being told what the words are
  ◮ learning inter-word dependencies improves word segmentation
  ◮ learning intra-word structure (syllable structure) also helps
• There are learning algorithms whose computationally most demanding component is sampling parses of input sequences


Outline

Grammar-based and statistical parsing

From grammar-based to treebank parsing

Features in our rescoring parser

Open-world grammars and parsing

Statistical models of language acquisition and comprehension

Conclusion


Conclusion

• Be clear about why you're building any computational model
  ◮ usual engineering goals ⇒ the average case matters
  ◮ open-world parsing (unknown words, fragment parsing)
  ◮ you don't have to capture a generalization in order to cover it
  ◮ what does determining grammaticality have to do with comprehension?
• Statistical models of processing and acquisition are having a major impact
  ◮ probabilistic parsing models are having an increasing impact in psycholinguistics and neurolinguistics
  ◮ (non-parametric) Bayesian models provide new ways of viewing language acquisition
