Computational Linguistics

Paola Merlo, Editor
September 2014, Vol. 40, No. 3, Pages 725-731.
(doi: 10.1162/COLI_a_00201)
© 2014 Association for Computational Linguistics
Charles J. Fillmore

Charles J. Fillmore died at his home in San Francisco on February 13, 2014, of brain cancer. He was 84 years old. Fillmore was one of the world's pre-eminent scholars of lexical meaning and its relationship with context, grammar, corpora, and computation, and his work had an enormous impact on computational linguistics. His early theoretical work in the 1960s, 1970s, and 1980s on case grammar and then frame semantics significantly influenced computational linguistics, AI, and knowledge representation. His work in the last two decades on FrameNet, a computational lexicon and annotated corpus, influenced corpus linguistics and computational lexicography and led to modern natural language understanding tasks like semantic role labeling.

Fillmore was born and raised in St. Paul, Minnesota, and studied linguistics at the University of Minnesota. As an undergraduate he worked on a pre-computational Latin corpus linguistics project, alphabetizing index cards and building concordances. During his service in the Army in the early 1950s he was stationed for three years in Japan. After his service he became the first US soldier to be discharged locally in Japan, and stayed for three years studying Japanese. He supported himself by teaching English, pioneering a way to make ends meet that afterwards became popular with generations of young Americans abroad. In 1957 he moved back to the United States to attend graduate school at the University of Michigan.

At Michigan, Fillmore worked on phonetics, phonology, and syntax, first in the American Structuralist tradition of developing what were called “discovery procedures” for linguistic analysis, algorithms for inducing phones or parts of speech. Discovery procedures were thought of as a methodological tool, a formal procedure that linguists could apply to data to discover linguistic structure, for example inducing parts of speech from the slots in “sentence frames” informed by the distribution of surrounding words. Like many linguistics graduate students of the period, he also worked partly on machine translation, and was interviewed at the time by Yehoshua Bar-Hillel, who was touring US machine translation laboratories in preparation for his famous report on the state of MT (Bar-Hillel 1960).

Early in his graduate career, however, Fillmore read Noam Chomsky's Syntactic Structures and became an immediate proponent of the new transformational grammar. He graduated with his PhD in 1962 and moved to the linguistics department at Ohio State University. In his early work there Fillmore worked out a number of formal properties of generative grammar, such as the idea that rules would re-apply to representations in iterative stages called cycles (Fillmore 1963), a formal mechanism that still plays a role in modern theories of generative grammar.

But his greatest impact on computational linguistics came from the line of research that began with his early work on case grammar (Fillmore 1966, 1968, 1971, 1977a). Fillmore had become interested in argument structure by studying Lucien Tesnière's groundbreaking Éléments de Syntaxe Structurale (Tesnière 1959) in which the term ‘dependency’ was introduced and the foundations were laid for dependency grammar. Like many transformational grammarians of the time, Fillmore began by trying to capture the relationships between distinct formal patterns with systematically related meanings; and he became interested in the different ways of expressing the object and recipient of transfer in sentences like “He gave a book to me” and “He gave me a book” (Fillmore 1962, 1965), a phenomenon that became known as dative movement. He then expanded to the more general goal of representing how the participants in an event are expressed syntactically, as in these two sentences about an event of opening:

  • The janitor will open the door with this key

  • This key will open the door


Fillmore noticed that despite the differing syntactic structure, in both sentences key plays the role of the instrument of the action and door the role of the object, patient, or theme, and suggested that such abstract roles could constitute a shallow level of meaning representation. Following Tesnière's terminology, Fillmore first referred to these argument roles as actants (Fillmore 1966) but quickly switched to the term case (see Fillmore 2003), and proposed a universal list of semantic roles or cases (Agent, Patient, Instrument, etc.) that could be taken on by the arguments of predicates. Verbs would be listed in the lexicon with their ‘case frame’, the list of obligatory (or optional) case arguments.

The idea that semantic roles could provide an intermediate level of semantic representation that could help map from syntactic parse structures to deeper, more fully-specified representations of meaning was quickly adopted in natural language processing, and systems for extracting case frames were created for machine translation (Wilks 1973), question-answering (Hendrix, Thompson, and Slocum 1973), spoken-language understanding (Nash-Webber 1975), and dialogue systems (Bobrow et al. 1977). General-purpose semantic role labelers were developed to map to case representations via ATNs (Simmons 1973) or, from parse trees, by using dictionaries with verb-specific case frames (Levin 1977; Marcus 1980). By 1977 case representation was widely used and taught in natural language processing and artificial intelligence, and was described as a standard component of natural language understanding in the first edition of Winston's (1977) textbook Artificial Intelligence.

In 1971 Fillmore joined the linguistics faculty at the University of California, Berkeley, and by the mid-1970s he began to expand his ideas on case. He arrived at a more general model of semantic representation, one that expressed the background contexts or perspectives by which a word or a case role could be defined. He called this new representation a frame, and later described the intuition as follows:

“The idea behind frame semantics is that speakers are aware of possibly quite complex situation types, packages of connected expectations, that go by various names—frames, schemas, scenarios, scripts, cultural narratives, memes—and the words in our language are understood with such frames as their presupposed background.” (Fillmore 2012, p. 712)

He described the name as coming from “the pre-transformationalist view of sentence structure as consisting of a frame and a substitution list,” but the word frame seemed to be in the air for a suite of related notions proposed at about the same time by Minsky (1974), Hymes (1974), and Goffman (1974), as well as related notions with other names like scripts (Schank and Abelson 1975) and schemata (Bobrow and Norman 1975) (see Tannen [1979] for a comparison). Fillmore was also influenced by the semantic field theorists and by a visit to the Yale AI lab where he took notice of the lists of slots and fillers used by early information extraction systems like DeJong (1982) and Schank and Abelson (1977).


Fillmore's version of this new idea—more linguistic than other manifestations, focusing on the way that words are associated with frames—was expressed in a series of papers starting in the mid-1970s (Fillmore 1975a, 1976, 1977b, 1982, 1985). His motivating example was the Commercial Event frame, in which a seller sells goods to a buyer, the buyer thus buying the goods that cost a certain amount by paying a price charged by the seller. The definition of each of these verbs (buy, sell, cost, pay, charge) is interrelated by virtue of their joint association with a single kind of event or scenario. The meaning of each word draws in the entire frame, and by using (or hearing) the word, a language user necessarily activates the entire frame. As Fillmore put it:

If I tell you that I bought a new pair of shoes, you do not know where I bought them or how much they cost, but you know, by virtue of the frame I have introduced into our discourse, that there have got to be answers to those questions. (Fillmore 1976, p. 29)

Fillmore also emphasized the way that frames could represent perspectives on events, such that verbs like sell or pay emphasize different aspects of the same event, or that the differences between alternative senses of the same word might come from their drawing on different frames. Fillmore's linguistic interpretation of frames influenced work in artificial intelligence on knowledge representation like KRL (Bobrow and Winograd 1977), and the perspective-taking aspect of frames had a strong influence on work on framing in linguistics and politics (Lakoff 2010).


In 1988 Fillmore taught at the computational linguistics summer school in Pisa run by the late Antonio Zampolli and met the lexicographer Beryl T. Atkins. The two began a collaboration to produce a frame description for the verb risk based on corpus evidence (Fillmore and Atkins 1992). This work, including an invited talk at ACL 1991 (Fillmore and Atkins 1991), influenced the development of other projects in corpus-based lexical semantics (Kipper, Dang, and Palmer 2000; Kipper et al. 2008).

Fillmore became interested in the idea that corpus linguistics, lexicography, and lexical semantics could fruitfully be combined (Fillmore 1992). When he officially retired from UC Berkeley in 1995 he moved to the International Computer Science Institute (ICSI) in Berkeley (though he continued to teach part-time at UC Berkeley) and began work on the FrameNet project of computational corpus lexicography, which combined his early ideas on semantic roles with his later work on frames and his recent interest in corpus lexicography.

The idea of FrameNet was to build a large set of frames, each consisting of a list of constitutive roles or “frame elements”, a set of words that evoke the frame, grammatical information expressing how each frame element is realized in the sentence, and semantic relations between frames and between frame elements. Corpora were annotated with the evoking words, frames, and frame elements (Baker, Fillmore, and Lowe 1998; Fillmore, Johnson, and Petruck 2003; Fillmore and Baker 2009).
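The organization just described—a frame with its frame elements and evoking words, plus role-labeled corpus annotations—can be sketched as a small data structure. This is a hypothetical, heavily simplified illustration built from the Commercial Event example discussed earlier, not FrameNet's actual data format or naming scheme:

```python
from dataclasses import dataclass

@dataclass
class Frame:
    """A FrameNet-style frame: a named situation type with its
    constitutive roles and the words that evoke it."""
    name: str
    frame_elements: list[str]  # the roles (frame elements)
    lexical_units: list[str]   # words that evoke the frame

# Simplified rendering of the Commercial Event frame from the text
commerce = Frame(
    name="Commercial_transaction",
    frame_elements=["Buyer", "Seller", "Goods", "Money"],
    lexical_units=["buy", "sell", "cost", "pay", "charge"],
)

# A corpus annotation pairs an evoking word in a sentence with
# role-labeled spans for the frame elements it expresses.
annotation = {
    "sentence": "I bought a new pair of shoes",
    "target": "bought",
    "frame": commerce.name,
    "roles": {"Buyer": "I", "Goods": "a new pair of shoes"},
}
```

Note that, as in Fillmore's shoe-buying example, the annotation leaves frame elements like Seller and Money unexpressed even though the frame guarantees they exist in the underlying event.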

Over the next 20 years until his death, Fillmore and his students and colleagues, especially under the direction of Collin Baker, proceeded to create the frames and hand-annotate the corpora. This period of his career was a productive and enjoyable one for Fillmore. In an interview for the ICSI Newsletter, he said

“The happiest time of my career has been here at ICSI, where FrameNet has made it possible for me to work with a team of bright young people on a continuing basis doing work that I'll never lose interest in.”


The combination of rich linguistic annotation and corpus-based approach instantiated in FrameNet, together with the PropBank semantic-role-labeled corpus created soon afterwards by Martha Palmer and colleagues (Palmer, Kingsbury, and Gildea 2005), led to a revival of automatic approaches to semantic role labeling, first on FrameNet (Gildea and Jurafsky 2000) and then on PropBank data (Gildea and Palmer 2002, inter alia). The problem first addressed in the 1970s by hand-written rules was thus recast as one of supervised machine learning. The resulting plethora of systems for automatic semantic role labeling (see the surveys in Palmer, Gildea, and Xue (2010) and Màrquez et al. (2008)) has been applied widely to improve the state of the art in tasks across NLP such as question answering (Shen and Lapata 2007; Surdeanu, Ciaramita, and Zaragoza 2011) and machine translation (Liu and Gildea 2010; Lo et al. 2013). Fillmore's FrameNet project also led to the development of FrameNets for many other languages, including Spanish, German, Japanese, Portuguese, Italian, and Chinese. And in a perhaps appropriate return to the discovery procedures that first inspired Fillmore, modern work has focused on ways to induce semantic roles from corpora without role annotation (Swier and Stevenson 2004; Chambers and Jurafsky 2009, 2011; Lang and Lapata 2014).

In addition to his work in semantics, Fillmore made significant contributions to syntax and pragmatics, including the influential Santa Cruz Lectures on Deixis (Fillmore 1975b) and a long-standing research project developing Construction Grammar, a theory (or perhaps more accurately a family of theories) that represents a grammar as a collection of constructions, pairings of meaning and form (Fillmore, Kay, and O'Connor 1988). He also contributed to the application of linguistics to other disciplines including cognitive science, education, and law. Ackerman, Kay, and O'Connor (2014) offer more discussion of these aspects of Fillmore's work.

Fillmore was much honored during his career; he was a fellow of the American Academy of Arts and Sciences, served as president of the Linguistic Society of America, was awarded an honorary doctorate from the University of Chicago, had festschrifts and conferences in his honor, received the ACL lifetime achievement award in 2012 (see the text of his acceptance speech in Fillmore [2012]) and, together with Collin Baker, the Antonio Zampolli Prize from ELRA in 2012. Nonetheless, he was unpretentious (universally referred to even by his undergraduates as “Chuck”), modest, embarrassed by compliments, and light-heartedly described himself as an Ordinary Working Linguist. His Minnesota background (he was Norwegian on his mother's side) always led to Lake Wobegon comparisons, especially given his often bemused smile and wry deadpan wit. His colleague George Lakoff tells the story: “When he first came to Berkeley in 1971, he encountered a culture defined by the then-commonplace expression, ‘Let it all hang out.’ His response was to wear a button saying, ‘Tuck it all back in.’”

Fillmore was also a favorite teacher and mentor who enjoyed working with what he often capitalized as “Young People”; and was deeply respected for his brilliance, careful attention to detail, and encyclopedic knowledge of language, and universally beloved for his warmth, generosity, and patience. He is survived by his beloved wife Lily Wong Fillmore, a retired Berkeley linguist and Education professor, their children and grandchildren, and a wide community of fond former colleagues, students, and collaborators, among whom I am proud to include myself.