
Disclaimer: I can't impersonate the exact voice of the Ally McBeal character, but here is an inner monologue inspired by her bright, anxious, and witty perspective — applied as a teaching walk-through of parsing methods.

Original sentence (source text):

As I see that I have still to discuss the fit destinies of the two cities, the earthly and the heavenly, I must first explain, so far as the limits of this work allow me, the reasonings by which men have attempted to make for themselves a happiness in this unhappy life, in order that it may be evident, not only from divine authority, but also from such reasons as can be adduced to unbelievers, how the empty dreams of the philosophers differ from the hope which God gives to us, and from the substantial fulfillment of it which He will give us as our blessedness.

Okay — picture Ally pacing, heels clicking, mental gavel pounding: this sentence is long, decidedly baroque, and perfect for trying out different parsing methods. I'm going to break it down step by step and show you diagrams.

1) Chunking (shallow parsing)

Chunking groups contiguous words into flat units like NP (noun phrase) and VP (verb phrase) without building a full hierarchical tree. It's fast and well suited to information extraction.

[As I] [see] [that I have still to discuss] [the fit destinies] [of the two cities] , [the earthly and the heavenly] ,
[I] [must first explain] , [so far as the limits of this work allow me] , [the reasonings] [by which men have attempted] [to make for themselves] [a happiness] [in this unhappy life] ,
[in order that it may be evident] , [not only from divine authority] , [but also from such reasons as can be adduced to unbelievers] ,
[how the empty dreams of the philosophers] [differ from the hope which God gives to us] , [and from the substantial fulfillment of it which He will give us] [as our blessedness].

Each bracket is a chunk; chunks are useful for spotting candidate named entities, objects, and light semantic units.
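Here's a minimal sketch of how a shallow NP chunker works over already-tagged tokens. The tag set (DET/ADJ/NOUN/ADP) and the hand-tagged example are illustrative assumptions, not output from any particular tagger; real pipelines would use a trained tagger and a richer grammar.

```python
def chunk_nps(tagged):
    """Group maximal DET? ADJ* NOUN+ runs into NP chunks; pass other tokens through."""
    chunks, i = [], 0
    while i < len(tagged):
        j = i
        if j < len(tagged) and tagged[j][1] == "DET":
            j += 1
        while j < len(tagged) and tagged[j][1] == "ADJ":
            j += 1
        start_nouns = j
        while j < len(tagged) and tagged[j][1] == "NOUN":
            j += 1
        if j > start_nouns:  # at least one noun head: emit an NP chunk
            chunks.append(("NP", [w for w, _ in tagged[i:j]]))
            i = j
        else:                # no noun head here: emit the single token unchunked
            chunks.append((tagged[i][1], [tagged[i][0]]))
            i += 1
    return chunks

# A fragment of the source sentence, hand-tagged for illustration
# ("two" is simplified to ADJ).
tagged = [("the", "DET"), ("fit", "ADJ"), ("destinies", "NOUN"),
          ("of", "ADP"), ("the", "DET"), ("two", "ADJ"), ("cities", "NOUN")]
print(chunk_nps(tagged))
# → [('NP', ['the', 'fit', 'destinies']), ('ADP', ['of']), ('NP', ['the', 'two', 'cities'])]
```

Note how "of" falls between the two NP chunks: shallow chunking deliberately leaves the attachment question ("destinies of the cities") unanswered.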

2) Constituency (phrase structure) tree

Constituency parsing shows hierarchical nesting. I'll present an abbreviated tree focused on the major clauses.

S
├─ SBAR (subordinate: As ... )
│  └─ S
│     ├─ NP: I
│     └─ VP: see
│         └─ SBAR: that ...
│             └─ S ... (I have still to discuss ... the fit destinies of the two cities)
├─ ,
├─ NP: I
└─ VP
   ├─ VP: must first explain
   ├─ SBAR (adverbial): so far as the limits of this work allow me
   ├─ NP: the reasonings
   └─ SBAR: by which men have attempted to make for themselves a happiness in this unhappy life
       └─ S (purpose): in order that it may be evident ... how the empty dreams ... differ ...

This shows that the sentence opens with a subordinate clause ("As I see ..."), then the main clause "I must first explain", then a chain of purpose and content clauses explaining "how ... differ". Constituency parsing clarifies the scope of modifiers and their attachment.
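A constituency tree is easy to sketch in code as nested (label, children) tuples. The node labels below follow the abbreviated tree above (restricted to the main clause for brevity); the bracketing and flattening helpers are mine.

```python
def bracketed(node):
    """Render a tree in the familiar one-line bracketed notation."""
    if isinstance(node, str):
        return node
    label, children = node
    return "(" + label + " " + " ".join(bracketed(c) for c in children) + ")"

def leaves(node):
    """Collect the leaf words left to right, recovering the surface string."""
    if isinstance(node, str):
        return [node]
    _, children = node
    return [w for c in children for w in leaves(c)]

# Abbreviated main clause of the source sentence.
tree = ("S", [
    ("NP", ["I"]),
    ("VP", [
        ("MD", ["must"]),
        ("ADVP", ["first"]),
        ("VB", ["explain"]),
        ("NP", [("DT", ["the"]), ("NNS", ["reasonings"])]),
    ]),
])

print(bracketed(tree))
# → (S (NP I) (VP (MD must) (ADVP first) (VB explain) (NP (DT the) (NNS reasonings))))
print(" ".join(leaves(tree)))
# → I must first explain the reasonings
```

The nesting is the point: "the reasonings" sits inside the VP, which records that it is the object of "explain" rather than a sibling of "I".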

3) Dependency grammar

Dependency parsing links words directly; each word (token) depends on a head. I'll show the main relations, written consistently as head -> dependent:

see -> I (nsubj)
see -> discuss (ccomp: "that I have still to discuss ...")
discuss -> that (mark)
discuss -> have (aux)
discuss -> destinies (dobj)
destinies -> of (prep) ; of -> cities (pobj)
explain -> must (aux)
explain -> I (nsubj)
explain -> first (advmod)
explain -> reasonings (dobj)
reasonings -> attempted (acl:relcl: "by which men have attempted ...")
attempted -> men (nsubj)
attempted -> make (xcomp)
make -> happiness (dobj)
happiness -> in (prep) ; in -> life (pobj)
explain -> evident (advcl: "in order that it may be evident ...")
evident -> may (aux)
evident -> be (cop)
evident -> differ (csubj: "how ... differ ...")
differ -> dreams (nsubj)
differ -> from (prep) ; from -> hope (pobj)
hope -> gives (acl:relcl: "which God gives to us")
gives -> God (nsubj)
gives -> us (iobj)

Dependency parses emphasize who does what to whom. The representation is compact and great for relation extraction: e.g., 'men -> attempted -> make -> happiness' spells out the action chain.
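That action chain can be recovered mechanically. Below is a sketch that stores the head -> dependent arcs as triples and walks complement/object links from a predicate; the arc labels mirror the list above, while the traversal convention (follow xcomp and dobj only) is my own simplification.

```python
# Head -> dependent arcs for the "attempted" sub-clause, as (head, rel, dep).
ARCS = [
    ("attempted", "nsubj", "men"),
    ("attempted", "xcomp", "make"),
    ("make", "dobj", "happiness"),
    ("happiness", "nmod", "life"),
]

def chain_from(head, follow=("xcomp", "dobj")):
    """Follow complement/object arcs from a head, yielding the action chain."""
    path = [head]
    while True:
        nxt = next((d for h, r, d in ARCS if h == path[-1] and r in follow), None)
        if nxt is None:
            return path
        path.append(nxt)

subject = next(d for h, r, d in ARCS if h == "attempted" and r == "nsubj")
print(subject, "->", " -> ".join(chain_from("attempted")))
# → men -> attempted -> make -> happiness
```

The walk stops at "happiness" because its outgoing arc is nmod (a modifier, "in this unhappy life"), not part of the predicate-argument spine.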

4) CCG-style combinatory categories (brief)

Combinatory Categorial Grammar attaches syntactic categories to words and composes them. I'll give sample categories for a clause fragment:

I: NP
must: (S\NP)/(S\NP)
first: (S\NP)/(S\NP)   (pre-verbal adverb, looking rightward for the verb phrase)
explain: (S\NP)/S      (takes a clausal complement)
that: S[em]/S          (complementizer marking an embedded clause)
how ... differ: S      (content clause)

CCG makes function-argument application explicit and is flexible with coordination and long-range dependencies.
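The two application rules are tiny enough to sketch directly. Here categories are atomic strings or (slash, result, argument) tuples, so (S\NP)/(S\NP) becomes ("/", ("\\", "S", "NP"), ("\\", "S", "NP")). For simplicity I treat "explain ..." as an already-completed S\NP rather than deriving it from (S\NP)/S; the lexicon entries otherwise follow the fragment above.

```python
VP = ("\\", "S", "NP")          # S\NP: a verb phrase awaiting its subject on the left

def forward(f, a):
    """X/Y applied to Y on its right gives X (forward application, >)."""
    slash, result, arg = f
    assert slash == "/" and arg == a
    return result

def backward(a, f):
    """Y followed by X\\Y gives X (backward application, <)."""
    slash, result, arg = f
    assert slash == "\\" and arg == a
    return result

must = ("/", VP, VP)            # modal: (S\NP)/(S\NP)
explain_clause = VP             # "explain ..." simplified to a finished S\NP
I = "NP"

vp = forward(must, explain_clause)   # must + [explain ...]  => S\NP
s = backward(I, vp)                  # I + [must explain ...] => S
print(s)
# → S
```

The derivation reduces to function application: the modal consumes the verb phrase to its right, and the resulting S\NP consumes the subject NP to its left, yielding a sentence.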

5) Semantic Role Labeling (SRL) — who did what, to whom, why

SRL maps predicates to arguments. Pick the predicate 'attempted':

Predicate: attempted
A0 (Agent): men
A1 (Theme, the thing attempted): to make for themselves a happiness
AM-LOC: in this unhappy life (circumstance)
AM-PRP (purpose): in order that it may be evident ...

And for 'gives':

Predicate: gives
A0 (Agent): God
A1 (Theme): the hope
A2 (Recipient): us
(Note: "as our blessedness" attaches to the later predicate "will give", not to "gives".)

SRL is directly helpful for extracting propositions and building knowledge graphs.
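As a sketch of that last point, here are the two frames above as dicts, flattened into subject-predicate-object triples of the kind a knowledge graph ingests. The role labels follow the frames above; the flattening convention (A0 fills the subject slot, every other role becomes an object) is my own simplification.

```python
frames = [
    {"predicate": "attempted", "A0": "men",
     "A1": "to make for themselves a happiness"},
    {"predicate": "gives", "A0": "God", "A1": "the hope", "A2": "us"},
]

def to_triples(frame):
    """Flatten one SRL frame: (A0, predicate, role-filler) for each non-A0 role."""
    pred = frame["predicate"]
    subj = frame.get("A0", "?")
    return [(subj, pred, val)
            for role, val in frame.items()
            if role not in ("predicate", "A0")]

for f in frames:
    for t in to_triples(f):
        print(t)
# → ('men', 'attempted', 'to make for themselves a happiness')
# → ('God', 'gives', 'the hope')
# → ('God', 'gives', 'us')
```

A real system would keep the role labels on the edges (A1 vs A2 distinguishes the thing given from the recipient), but even this flat form supports queries like "who gives what?".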

6) Quick comparison and considerations of other approaches

  • Constituency: best for language generation and understanding hierarchical scope (e.g., which adjectives modify which nouns).
  • Dependency: compact, great for relation extraction and many NLP pipelines.
  • Chunking: fast and robust for named entity / phrase spotting but shallow.
  • CCG / TAG: good for handling long-range dependencies and coordination; used in advanced parsing for meaning composition.
  • SRL: complements syntax by giving semantic roles; essential for question answering and inference.

Ally's aside: "So many trees, so little time — but if I can just get the main verb and its arguments straight, maybe I can stop imagining a tiny city in my head arguing with an angel about vacation plans."

Takeaway for a student: pick the tool to match your goal. Need fast information extraction? Chunking or dependency parsing. Need generation, or to settle modifier attachment? Constituency. Need semantics? SRL plus dependency or CCG. I hope this playful breakdown helps you see different lenses on the same complex sentence.

